
The Beatles' new song made possible with AI, "Now And Then"

PLUS: Anthropic, Amazon, and Google's sloppy love triangle


Good morning, human brains. Welcome back to your daily munch of AI news.

Here’s what’s on the menu today:

  • Is Beatlemania still alive 60 years later? AI says so 🎙️ 🤖

    The Beatles extracted John Lennon’s vocals from old demos with AI.

  • This robot dog does parkour, runs, jumps, and now it talks 🐶 🙊

Spot now talks, thanks to GPT-powered text-to-speech capabilities.

  • Anthropic, Amazon, and Google’s drama-filled love triangle 🤷‍♀️ 🤖 🤷‍♀️

    Google reportedly invested $2 billion in Anthropic.

MAIN COURSE

The fifth Beatle? AI, of course 🎙️ 🤖

In September, we reported on Stable Audio, Stability AI’s music and sound effects generator.

On Thursday, The Beatles launched a teaser for their “last” song. It’s called “Now and Then.”

Aren’t they dead, though?

John Lennon and George Harrison are, but Paul McCartney and Ringo Starr used AI to finish an old recording.

So it’s an old song, but a new song?

The song is called "Now And Then." It originated from a batch of unreleased demos written and recorded by John Lennon.

In the mid-1990s, the surviving Beatles completed and released two songs from that batch, “Free As a Bird” and “Real Love.”

Paul McCartney, Ringo Starr, and George Harrison took a pass at "Now And Then" back then too, but the technology of the day couldn’t cleanly separate Lennon’s vocals from the piano on the demo tape, so they shelved it.

So, how are they using AI?

They used AI to separate Lennon’s vocals from the piano on the demo, which was originally recorded in the 1970s, then cleaned up the vocal until it was clear enough to sit in a finished record.

The same demixing process also isolated elements like Harrison’s guitar parts, recorded three decades ago, so they could be used in the new track.
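
Curious what that kind of demixing looks like under the hood? Here’s a minimal sketch using Demucs, an open-source source-separation model. To be clear, this isn’t the tool the Beatles’ team actually used; the file name and output path are assumptions, and it only illustrates the general idea of pulling a vocal away from everything else.

```python
# A minimal sketch of vocal isolation with the open-source Demucs model.
# Not the Beatles' actual tooling; file names and paths are assumptions.
import subprocess

# Ask Demucs for two stems: "vocals" and "no_vocals" (everything else).
# By default the stems land under ./separated/<model_name>/demo/.
subprocess.run(["demucs", "--two-stems=vocals", "demo.wav"], check=True)
```

Run that on a rough home demo and you get a vocals-only file you can drop into a fresh mix, which is (very roughly, and at far lower fidelity) the same trick the production team pulled off.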

If it’s John Lennon’s demo, how is it a Beatles song?

Starr played drums, McCartney played bass, and both added backing vocals. They also wove backing vocals from original Beatles tracks into the song.

McCartney also collaborated with Giles Martin on a string arrangement.

How do I listen to it?

The song will launch on November 3 as a vinyl and cassette single alongside “Love Me Do,” the Beatles’ first single.

The day before the single debuts, they’re also releasing a music video and a 12-minute film telling the story behind the new song.

SPONSORED BY GROWTHSCHOOL

Master ChatGPT & AI Hacks for FREE in Just 3 Hours

GrowthSchool’s 3-hour FREE ChatGPT & AI Workshop (worth $49) will help you become the top 1% of AI users 🤯

You can’t miss this if you are a:

✅ Working Professional Seeking AI Proficiency.

✅ Entrepreneur or Freelancer Eager to do more with your time.

✅ Creative Mind, Designer, Coder, or Student looking to upskill.

⏰ Hurry! Registrations Close in 24 Hours. ⏰

BUZZWORD OF THE DAY

Boston Dynamics’ “Spot”

Spot is an AI-powered, four-legged robot that was first revealed in 2016. It was the company's first commercial robot.

It can climb stairs and traverse rough terrain, map its environment, sense and avoid obstacles, and open doors.

It’s used to provide insights into routine operations, site health, and potentially hazardous situations.

It costs $74,500.

SIDE SALAD

Talk to this 4-legged robot dog 🐶 🙊

On Friday, we reported on NVIDIA Research’s HITL-TAMP. It’s a robotic learning method that combines two standard approaches to train robots more efficiently.

More bot-related nerd fodder?

Of course. Boston Dynamics trained its robot dog, Spot, to talk. The four-legged robot can already walk, run, jump, dance, and do parkour, and now it can communicate too.

How does it talk?

Boston Dynamics used OpenAI’s ChatGPT API, along with open-source LLMs, to generate Spot’s responses.

It added a speaker to Spot, incorporated text-to-speech features, and made its gripper mimic speech movements like a puppet.
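
If you want to fake your own chatty robot, here’s a rough sketch of the same recipe: an LLM writes a line in character, then a text-to-speech engine reads it out loud. This is not Boston Dynamics’ code; the persona prompt, the model name, and the offline TTS engine (pyttsx3) are illustrative assumptions.

```python
# A rough sketch of a "talking robot" loop: an LLM generates an in-character
# line, then a text-to-speech engine speaks it. Not Boston Dynamics' code;
# the persona prompt, model choice, and TTS engine are assumptions.
from openai import OpenAI
import pyttsx3

client = OpenAI()  # expects OPENAI_API_KEY in the environment


def speak_as_persona(persona: str, observation: str) -> str:
    """Ask the LLM for a short in-character remark, then say it aloud."""
    response = client.chat.completions.create(
        model="gpt-4",
        messages=[
            {
                "role": "system",
                "content": f"You are a robot tour guide. Persona: {persona}. "
                           "Keep replies to one or two sentences.",
            },
            {"role": "user", "content": observation},
        ],
    )
    line = response.choices[0].message.content

    engine = pyttsx3.init()  # offline text-to-speech
    engine.say(line)
    engine.runAndWait()
    return line


speak_as_persona("a 1920s archaeologist", "We just arrived at the charging dock.")
```

Swap the persona string and you get the bratty teenager or the Shakespearean time traveler instead.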

What does Spot sound like?

Like anything you want. Spot assumes various personas including a 1920s archaeologist, a bratty teenager, a Shakespearean time traveler, and more.

So he’s a fancy talking robot. Why would anyone want him?

Beyond its playful appearance, Spot has practical capabilities like opening doors and running surveillance, which has led to its adoption by police and military units.

Oh… So he always does what you want then, right?

Not at all. During tests, Spot surprised the team with its responses, like recognizing older Spot models as its “parents” and mistakenly suggesting another robot was designed for yoga.

A LITTLE SOMETHING EXTRA

Anthropic ain’t picky 🤷‍♀️ 🤖 🤷‍♀️

In September, we reported on Anthropic’s new partnership with Amazon. Amazon agreed to invest up to $4 billion for a minority stake in Anthropic.

On Friday, Google reportedly agreed to back Anthropic with up to $2 billion in additional funding.

Love triangle, anyone? 🤖 🍿

Billions?

You bet. Google’s investment reportedly comprises $500 million upfront, with another $1.5 billion to follow over time.

How much is Anthropic worth now?

Following this round, Anthropic’s valuation is reportedly somewhere between $20 billion and $30 billion.

Why is Google shacking up with Anthropic?

Neither company has given an official statement yet, but in leaked documents, Anthropic admitted its aim is to raise $5 billion to directly challenge OpenAI.


MEMES FOR DESSERT

Someone put a horse head on Spot. The result is… well… you be the judge.

YOUR DAILY MUNCH

Think Pieces

A prompt engineering experiment with DALL-E 3. A look at how different prompts produce different results, and how the model handles unexpected changes.

A paper on AI risks from leading AI researchers, including Geoffrey Hinton. How AI has developed in recent years and what to prepare for in the future.

What you need to know about the leading AI firms. How OpenAI, Anthropic, Cohere, and more are performing against each other.

Startup News

Meta posted an official “How-to” on using Llama 2. Includes info about the initial installation, how to work with it, how to fine-tune it, and more.

CentML, a startup that aims to cut the cost of running AI models, raised $27 million. Investors include NVIDIA, Microsoft Azure’s AI VP, and more.

Research

CommonCanvas — a text-to-image AI model trained using Creative-Commons images.

Wonder3D — a 3D reconstruction method that uses a cross-domain diffusion model to generate multi-view maps.

CLEX — a method that extends the context lengths of LLMs beyond the training sequence length for better performance in practical tasks.

Tools

StoryBee — an AI platform to easily create kids’ stories.

UXSniff — observes and provides analytics about your site’s users.

Resolve AI — GPT-powered customer service bot that handles half your tickets.

Docue.ai — AI assistant for your sales quotes, proposals, and more.

RECOMMENDED READING

If you like Bot Eat Brain there’s a good chance you’ll like this newsletter too:

👨 The Average Joe — Market insights, trends, and analysis to help you become a better investor. We like their easy-to-read articles that cut right to the meaty bits.

TWEET OF THE DAY

Geoffrey Hinton, the Godfather of AI, along with several other leading AI researchers, published a paper on the risks of AI in the near future.

This split the AI community into two groups. The first group believes that AI, in its current form, poses serious risks to humanity.

The second group believes that current AI is too “dumb” to achieve AGI, and it would take a more advanced form to pose a threat to humankind.

What side are you on?

Tag us on Twitter @BotEatBrain for a chance to be featured here tomorrow.

AI ART-SHOW

Until next time 🤖😋🧠

What'd you think of today's newsletter?
