🧠 Meta enters the AI chip wars

PLUS: Healthcare AI Beats ChatGPT

Good morning human brains, welcome back to your daily munch of AI news.

Here’s what’s on the menu today:

  • $50M Healthcare AI 💊

    A new AI company, Hippocratic AI, raised $50 million in funding to make an AI system for healthcare.

  • NVIDIA’s New Generative AI 🧠

    NVIDIA and ServiceNow team up to make IT support more efficient.

  • Meta’s new AI chip 💽

The AI chip wars rage on! Meta unveiled a new chip specifically designed for running AI programs.


AI takes a health check 💊

But how’s the bedside manner?

Hippocratic AI is a new company that has raised $50 million in funding to create an AI system specifically for healthcare.

What is their aim?

To equip AI with medical terminologies, patient interaction skills, and a deep understanding of healthcare subtleties.

Early tests show Hippocratic AI’s model outperforming GPT-4 in several healthcare domains:

Hippocratic AI outperforms GPT-4 in these areas

Despite Hippocratic AI’s promising early results, the path toward industry-specific AI models is still largely uncharted, teeming with uncertainties and potential.

Hippocratic AI’s CEO, Munjal Shah, concedes their approach may not be universally applicable.

Our Take: As we witness AI’s evolution and its growing influence across many sectors, it’s essential to stay inquisitive, grounded in AI fundamentals, and ready to adapt.



Large Language Models (LLMs) are AI systems that understand and generate human language. They can be used for tasks like summarizing text, translating languages, answering questions, and helping with coding.

They’re called “Large Language Models” because of their sheer scale: these models have billions of parameters and are trained on huge amounts of text data.

ChatGPT is the most famous example of an LLM.
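At their core, LLMs learn to predict the next word from the words that came before. A real LLM does this with a huge neural network; as a purely illustrative stand-in, here is a tiny bigram model that learns next-word statistics from a toy corpus (the corpus and function names are our own invention):

```python
from collections import Counter, defaultdict

# Toy illustration of the core idea behind LLMs: predicting the next
# word from context. Real LLMs use billion-parameter neural networks
# trained on vast corpora; this bigram counter is only a sketch.
corpus = "the cat sat on the mat the cat ate the fish".split()

# Count how often each word follows each other word.
following = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    following[prev][nxt] += 1

def predict_next(word):
    """Return the most frequent word observed after `word`, or None."""
    counts = following[word]
    return counts.most_common(1)[0][0] if counts else None

print(predict_next("the"))  # "cat" follows "the" most often in this corpus
```

Scale that idea up by many orders of magnitude in data and model size, and you get the fluent text generation ChatGPT is known for.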


Productized Video and Design service for small teams and busy founders.

If you’re a tech team looking for a better way to create stunning videos and graphics for your business, you need to check out MotionDock, the subscription-based creative agency that gives you access to an elite team at a fraction of the cost.

  • You get an elite team of 5 creatives, including video editors, motion designers, graphic designers, animators, and illustrators.

  • The flexibility to choose the hours you need, no long-term commitments, and the freedom to cancel anytime.

  • A service crafted specifically with small teams and busy founders in mind.

  • A risk-free pilot project to ensure we're the right fit for each other.

  • Included support from a project manager and creative director to streamline your work.

More info about our pricing and plans: https://motion-dock.com/home-2 

This could be the beginning of something amazing! Could mean an end to unreliable freelancers, expensive agencies, and inconsistent designs.

Book a call with the cofounders: https://calendly.com/ahmadalikhwaja/30min

(To claim your 50% discount, please mention that you are coming from the Bot Eat Brain newsletter in your correspondence with the team)


NVIDIA’s new generative AI 🧠

NVIDIA and ServiceNow have announced a partnership to build generative AI for businesses.

Why do we care?

The AI promises to customize the entire support process for businesses. It could improve customer service, aid employee learning, and enable developers to create safe, secure, and company-specific language models.

Here are the highlights:

1/ NVIDIA and ServiceNow’s AI could mean easy access to intelligent virtual assistants and chatbots, making IT support more efficient.

2/ Businesses could tweak these AI chatbots to meet their unique needs, turning AI into a customized resource that knows just what to say and when.

3/ The AI could act as a personal career coach, identifying learning and development opportunities tailored to each employee’s profile and queries.

4/ Developers could add safety and security features, ensuring that these AI chatbots stay in their lanes and don’t go all terminator on us.

Creating custom LLMs with NVIDIA

As exciting as this sounds, NVIDIA and ServiceNow haven’t given any information as to when we can expect to use this generative AI for ourselves.

So mysterious…

What this means: This partnership could spearhead a new age of enterprise AI, making our work lives more efficient and streamlined. With AI handling the heavy lifting, who knows what new horizons we could explore?


Meta enters the AI chip wars 🔥

In a move to fortify its AI capabilities, Meta launched its first AI chip.

Last week, we discussed Microsoft and AMD partnering to take on NVIDIA’s current dominance of the AI chip market.

With Microsoft, Google, and Amazon making their own chips now, Meta’s new chip means even more competition for NVIDIA.

Meta’s new MTIA chip

Here’s the scoop:

1/ Meta unveiled its new chip, the Meta Training and Inference Accelerator (MTIA). This chip is specifically tailored for running AI programs known as "deep learning recommendation models."

2/ This comes as Meta aims to supercharge its computing capabilities for AI, including future plans of building an AI-optimized data center.

3/ Meta also announced the Meta Scalable Video Processor (MSVP), a chip specifically designed to handle video processing more efficiently.

4/ Developers can write code for the MTIA chip in either the PyTorch or C++ programming languages, or use a new language made specifically for the chip called KNYFE. This flexibility helps developers create more optimized applications.
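For a feel of the workload MTIA targets, a deep learning recommendation model boils down to looking up learned embedding vectors for users and items and scoring candidates by similarity. The sketch below is a hypothetical, heavily simplified stand-in (random vectors instead of trained ones, dot-product scoring, made-up sizes), not Meta’s actual model:

```python
import numpy as np

# Hypothetical sketch of a recommendation-model core: embed users and
# items as vectors, then rank items for a user by dot-product score.
# All names and sizes here are illustrative assumptions.
rng = np.random.default_rng(0)
n_users, n_items, dim = 4, 6, 8

user_emb = rng.normal(size=(n_users, dim))   # stands in for learned user vectors
item_emb = rng.normal(size=(n_items, dim))   # stands in for learned item vectors

def recommend(user_id, top_k=3):
    """Return the indices of the top_k highest-scoring items for a user."""
    scores = item_emb @ user_emb[user_id]    # one similarity score per item
    return np.argsort(scores)[::-1][:top_k].tolist()

print(recommend(0))  # indices of the 3 best-scoring items for user 0
```

At Meta’s scale these embedding tables hold billions of entries, which is why the lookups and matrix math benefit from purpose-built silicon.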

What this means: Meta's move into AI chips marks an escalating trend of harnessing the power of AI to personalize and enhance user experiences across digital platforms. As the story unfolds, we might witness a significant transformation in the way we use social media and consume digital content.



MIT Researchers say that AI models fail to reproduce human judgments about rule violations

GPT-4’s Maze Navigation: A Deep Dive into ReAct Agent and LLM’s Thoughts

How to Train AI to Play a Game: Training Learning Reinforcement Models using PPO Algorithms

Startup News

Stability AI releases Stable Animation SDK, a text-to-animation tool

Anthropic AI expanded Claude’s context window to 100,000 tokens of text

Poe allows you to build your own chatbot in their app. Developer access officially opens


Google Research: Larger Language Models Do In-Context Learning Differently

University of Rome: Accelerating Transformer Inference for Translation via Parallel Decoding


YOLO NAS: State-of-the-art object detection

Nack: Enjoy the power of AI from the comfort of your phone.


Jailbreaker figures out GitHub Copilot’s secret rules



Until next time 🤖😋🧠

What'd you think of today's newsletter?
