- Elon Musk’s xAI has open-sourced the base code of its Grok AI model, but without any training code. The company describes it on GitHub as a “314 billion parameter Mixture-of-Experts model” (see the sketch after this list for what that layer design means).[1]
- Apple announces MM1, a family of multimodal LLMs with up to 30B parameters that are state-of-the-art on pre-training metrics and perform competitively after fine-tuning.[2]
- Microsoft tells European regulators Google has an edge in generative AI.[3]
- Nvidia’s Jensen Huang, Fed’s Powell may rock markets this week.[4]
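
The “Mixture-of-Experts” phrasing in the Grok item refers to a layer design in which a router activates only a few expert sub-networks per token, so only a fraction of the 314B total parameters is used on any single forward pass. The sketch below is a generic, illustrative top-k MoE layer in NumPy; it is not xAI’s released code, and the expert count and dimensions are made up for the example.

```python
import numpy as np

def softmax(x):
    e = np.exp(x - np.max(x))
    return e / e.sum()

def moe_layer(token, expert_weights, router_weights, top_k=2):
    """Route one token through the top-k experts and mix their outputs."""
    # Router scores: one logit per expert for this token.
    logits = router_weights @ token
    # Keep only the top-k experts and renormalize their gate values.
    top = np.argsort(logits)[-top_k:]
    gates = softmax(logits[top])
    # Weighted sum of the selected experts' outputs; the remaining experts
    # are never evaluated, which is why only a fraction of the parameters
    # is active per token.
    return sum(g * (expert_weights[i] @ token) for g, i in zip(gates, top))

# Toy dimensions (hypothetical): 8 experts, hidden size 16.
rng = np.random.default_rng(0)
n_experts, d = 8, 16
experts = rng.standard_normal((n_experts, d, d))
router = rng.standard_normal((n_experts, d))
out = moe_layer(rng.standard_normal(d), experts, router, top_k=2)
print(out.shape)  # (16,)
```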