so it doesn't need as many GPUs or as much money to train. The reason US companies and the market are in meltdown mode is that it cost a fraction of what they spend, and it was open sourced while they keep all their tech proprietary.
I've read about it; it uses a lot of the Llama architecture and is built in PyTorch, so nothing really new in terms of architecture, but it does cut into the idea that one or two tech companies on the west coast of California can have the secret sauce for the best models in the world.
They trained it for about the cost of one or two of OpenAI's top programmers 🙂