First Look Into Elon Musk & xAI's Grok

Dear Artisan,

Elon Musk’s xAI is starting to make waves in the AI space.

From open-sourcing the entire Grok model a few weeks ago to announcing its general availability to all X Premium users, it seems Grok is finally becoming an everyday, all-purpose tool for the masses.

Today, we get its biggest release yet: Grok 1.5, a majorly upgraded version of the previously released model.

Let’s dive in.

Hot Off The Press

The One Big Thing

Every day, a new model arrives with a chart like the one above, boasting about how performant the new technology is and how much better it is than everything else out there.

And then we all just end up using ChatGPT for all of our needs anyway.

For some reason, Grok feels different. For one, the AI actually talks (trolls) like Elon Musk, so it is more of a treat to talk to than ChatGPT. More importantly, it just got a batch of upgrades that let it take in more data, produce better results, and possibly access real-time X data as well.

xAI just announced Grok-1.5, which will be released to the public later this week. Let’s break it down.

The Basics:

  • Grok-1.5 is an AI model with improved long-context understanding and advanced reasoning, and it will be available on the 𝕏 platform this week.

  • Grok-1.5 demonstrates superior capabilities in coding and math-related tasks, notably achieving a 50.6% score on the MATH benchmark and a 90% score on the GSM8K benchmark.

  • It can process contexts up to 128K tokens long, with a 100% recall rate for information retrieval across its entire context window, indicating a significant increase in memory capacity.

  • Built on a custom distributed training framework utilizing JAX, Rust, and Kubernetes, Grok-1.5 benefits from robust infrastructure designed for efficiency, scalability, and reliability in large-scale AI model training.

  • It outperforms its predecessor Grok-1 and other notable models across various benchmarks.

“Long context” is going to be an increasingly common term in AI discussions. The longer the context a model can process, the more data you can feed it and the more sophisticated its responses can be.
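The 100% recall figure above comes from a “needle in a haystack” style evaluation: a single fact is buried at varying depths inside a long filler context, and the model is asked to retrieve it. Here is a minimal sketch of that idea; the helper names and the trivial stand-in “model” (a plain string search) are our own illustration, not xAI’s actual harness, and a real run would call an LLM API in place of `toy_model`.

```python
def build_haystack(needle: str, n_filler: int, depth: float) -> str:
    """Bury `needle` among filler sentences at a relative depth in [0, 1]."""
    filler = ["The quick brown fox jumps over the lazy dog."] * n_filler
    pos = int(depth * n_filler)
    return " ".join(filler[:pos] + [needle] + filler[pos:])

def recall_at_depths(needle: str, answer: str, depths, ask) -> float:
    """Fraction of insertion depths at which the model retrieves the fact."""
    hits = 0
    for d in depths:
        context = build_haystack(needle, n_filler=2000, depth=d)
        if answer in ask(context, "What is the secret number?"):
            hits += 1
    return hits / len(depths)

# Stand-in "model": simply scans the context for the needle sentence.
# A real evaluation would send `context` + `question` to the LLM here.
def toy_model(context: str, question: str) -> str:
    return "41" if "The secret number is 41." in context else "I don't know."

score = recall_at_depths(
    needle="The secret number is 41.",
    answer="41",
    depths=[i / 10 for i in range(11)],  # needle at 0%..100% depth
    ask=toy_model,
)
print(f"recall: {score:.0%}")  # a perfect retriever scores 100%
```

Sweeping both the insertion depth and the total context length (up to the full 128K tokens) is what produces the recall heatmaps these announcements typically show.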

This is the first real shot a foundation model has at going as mainstream as GPT, given the distribution both Elon Musk and the X platform command. While models like Anthropic’s Claude have broken through in terms of performance, their distribution does not come anywhere close to ChatGPT’s.

We have our first contender that can seriously change that. It’s going to be a big week.

AI with your phone on a phone stand… weird but cool

Another podcast (not as good as the AI Renaissance, but still good ;)

Tools

Must have tools for every artisan to add to their toolkit:

Deep Tech

The newest and coolest in the research world that you need to know about:

Closing Thought

The AI arms race is getting so real…

Choose your fighter, Mortal Kombat style: Elon, Zuck, SamA, Satya, etc.

Work With Us!

The AI Renaissance is coming and we are building the best community of the people making it happen.

Contact us to sponsor your product or brand and reach the exact audience for your needs across our newsletter and podcast network.