Your LLM is good, but could it be better?

PLUS: Generate images directly from Google Search


Good morning, human brains. Welcome back to your daily munch of AI news.

Here’s what’s on the menu today:

  • Steer your AI in a more personalized direction 🚀 🤖 

    NVIDIA’s released a new method to customize your LLM’s attributes.

  • Enhance your LLM’s robustness and performance 🦾 ⚙️

    Google unveiled Batch Calibration to refine LLMs’ in-context learning.

  • Generate images and writing drafts from Google search 🖥️ 🧠

    Google announced new features for its SGE (Search Generative Experience).

MAIN COURSE

Get more control over your LLM’s outputs 🚀 🤖

Last month, we reported on NVIDIA’s new partnership with Anyscale. Allegedly, the goal is to accelerate the process of developing generative AI models.

Or, world domination?

On Wednesday, NVIDIA unveiled NeMo SteerLM — which allows you to customize your LLM’s responses.

LLM = Large Language Model. ChatGPT is the most widely used LLM, but it is entirely possible to build or fine-tune your own bespoke model.

Sounds vague, how exactly can it help me?

It lets you adjust your model’s attributes in real time, during inference.

Huh?

Inference is using a trained AI model to make predictions or decisions.

Attributes are characteristics that you can use to describe something, like “funny” or “helpful.”

More on attributes, here.
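For a rough idea of what that looks like in practice, here’s a tiny sketch of attribute-steered prompting. The attribute names, the 0-to-9 scale, and the prompt format are our own illustrative guesses, not NVIDIA’s exact NeMo API:

```python
# Illustrative only: a rough sketch of attribute-steered prompting in the
# spirit of SteerLM. The attribute names, 0-9 scale, and prompt format are
# hypothetical, not NVIDIA's exact NeMo API.
attributes = {"helpfulness": 9, "humor": 2, "verbosity": 3}

system_prompt = (
    "You are a helpful assistant. Target response attributes: "
    + ", ".join(f"{name}:{value}" for name, value in attributes.items())
)
user_prompt = "Explain what inference means in machine learning."

# A SteerLM-style model trained on attribute-labeled data reads these targets
# at inference time; bumping "humor" to 9 changes the tone with no retraining.
print(system_prompt)
print(user_prompt)
```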

So, it gives me more power over my model’s outputs?

Yes. NVIDIA claims adjusting attributes like this wasn’t possible with previous methods. While that’s not entirely accurate, SteerLM appears to give your LLM more customization options.

How is it different from other training methods?

It claims to address current LLM limitations by providing nuanced responses instead of the common, short, mechanical outputs generated through supervised fine-tuning (SFT).

What can I use it for?

It’s flexible and allows for various use cases like gaming, education, and more.

It also allows you to serve multiple teams with personalized capabilities from a single model.

How do I get my hands on it?

SteerLM is available as part of NVIDIA’s NeMo framework. Also, check out this customized 13B Llama 2 model on Hugging Face.

FROM OUR PARTNERS

Apply to hundreds of jobs in 30 seconds. 🤝

In the age of AI, no one should spend hours clicking "upload resume" over and over again.

You should be spending your time prioritizing companies that have already shown interest in you, not waving your hand just trying to be seen.

ApplyAll maximizes your chance of getting an interview by using AI to apply to hundreds of relevant jobs tailored just for you.

Their bulk-apply AI applies you to 200 relevant jobs to supercharge your job search, whether you're looking to work at a startup or a big tech company.

Game the application process

Accelerate your job search without pouring time and effort into the application process.

Get interviews in your field

Bulk applying leads to more interviews, and ApplyAll claims a 99% success rate.

All you need is a resume

AI takes care of the rest. ApplyAll saves you hours, even days, by automating the repetitive stuff.

Accelerate your job search, save yourself hundreds of hours, and get your dream job.

BUZZWORD OF THE DAY

Supervised fine-tuning (SFT)

A way to teach pre-trained LLMs to do new tasks using specific examples.

It’s like giving the model extra lessons on a particular topic using a practice workbook with questions and correct answers.

How it works:

First, the model learns general patterns from a large dataset. Then, supervised fine-tuning sharpens it for a specific task using a smaller dataset of questions and correct answers, tweaking the model until it gives the right responses for that task.
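Want to see it in code? Here’s a minimal SFT sketch using Hugging Face Transformers. The base model, the tiny “workbook,” and the training settings are placeholders, not anything from the announcements above:

```python
# A minimal supervised fine-tuning (SFT) sketch with Hugging Face Transformers.
# The base model, toy "workbook" examples, and hyperparameters are placeholders.
from datasets import Dataset
from transformers import (AutoModelForCausalLM, AutoTokenizer,
                          DataCollatorForLanguageModeling, Trainer,
                          TrainingArguments)

model_name = "gpt2"  # stand-in for whatever pre-trained model you start from
tokenizer = AutoTokenizer.from_pretrained(model_name)
tokenizer.pad_token = tokenizer.eos_token
model = AutoModelForCausalLM.from_pretrained(model_name)

# The "practice workbook": questions paired with correct answers.
examples = [
    {"text": "Q: What is the capital of France?\nA: Paris."},
    {"text": "Q: What is 2 + 2?\nA: 4."},
]
dataset = Dataset.from_list(examples).map(
    lambda e: tokenizer(e["text"], truncation=True, max_length=128),
    remove_columns=["text"],
)

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="sft-demo", num_train_epochs=1,
                           per_device_train_batch_size=2),
    train_dataset=dataset,
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
)
trainer.train()  # nudges the pre-trained model toward the workbook's answers
```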

SIDE SALAD

Enhance your LLM’s performance 🦾 ⚙️

A couple of weeks ago, we covered Google’s Assistant with Bard, its upcoming AI combo.

On Friday, Google released Batch Calibration (BC) — a technique to refine LLMs’ in-context learning (ICL).

I know what that is. I know everything.

Just in case, in-context learning is a method to leverage an AI model’s pre-existing knowledge to perform tasks it wasn’t specifically trained for.
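In practice, that just means packing a few worked examples into the prompt. A made-up illustration:

```python
# A toy example of in-context learning: the "teaching" happens in the prompt,
# and no model weights are updated. The task and reviews are made up.
prompt = """Classify the sentiment as Positive or Negative.

Review: "The battery lasts all day." -> Positive
Review: "It broke after a week." -> Negative
Review: "Setup took thirty seconds." ->"""

# Send `prompt` to any capable LLM; it infers the pattern from the examples
# and completes the last line with "Positive".
print(prompt)
```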

So, how does it refine in-context learning?

Batch Calibration mitigates bias in ICL: the model’s inclination to favor certain outcomes based on the provided context, like the wording and order of the examples in your prompt.
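To make that concrete, here’s a toy sketch of the core trick as we understand it, with invented numbers:

```python
# A toy sketch of the Batch Calibration idea (not Google's code): average the
# model's scores for each class across a batch of test prompts, then subtract
# that average so the shared bias cancels out. All numbers are invented.
import numpy as np

# Log-probabilities an LLM assigned to two classes for four test prompts.
log_probs = np.array([
    [-0.2, -1.8],
    [-0.3, -1.5],
    [-0.4, -1.3],
    [-0.9, -0.6],   # the model still leans toward class 0 here
])

bias = log_probs.mean(axis=0, keepdims=True)  # estimated contextual prior
calibrated = log_probs - bias                 # remove the shared bias

print(log_probs.argmax(axis=1))   # uncalibrated: [0 0 0 0]
print(calibrated.argmax(axis=1))  # calibrated:   [0 0 0 1]
```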

Why does this matter?

It boosts accuracy across a variety of natural language understanding and image classification tasks, delivering state-of-the-art performance and unprecedented robustness.

Or so Google claims.

When would I use this?

Batch Calibration is effective for image classification tasks and prompt engineering. It requires less expertise to fine-tune prompts and promises to facilitate easier LLM adaptation.

So I should use it?

It’s worth looking into. It claims to require no training and keeps computational costs very low.

A LITTLE SOMETHING EXTRA

Google Search generates AI images 🖥️ 🧠

In August, we covered Google’s SGE (Search Generative Experience) updates. It brought AI-powered definitions, coding info, and article summaries to Google Search.

On Thursday, it introduced more AI features for SGE. It creates first drafts for your writing projects and generates images for your search queries.

Wait, it writes for me?

When you’re researching a topic, SGE helps you create a first draft about it. Then, you can edit it for length, tone, and more. Once you’re happy with the draft, you can export it to Google Docs, Gmail, and more.

And what about the pictures?

You can generate images by searching, which would be helpful if you can’t find the photo you're looking for. Then, you can customize the images by editing the description.

How do I get these features?

You have to opt into Google’s SGE experiment. The features are still rolling out, so if it’s not available now, it will be soon.

MEMES FOR DESSERT

It’s like comparing apples and oranges. Google has over 85% of the search market share, but ChatGPT’s “Browse with Bing” removes its 2021 knowledge cutoff and is very useful for quick, easy searches. More on Browse with Bing, here.

YOUR DAILY MUNCH

Think Pieces

LLMs, where to begin? Here’s a comprehensive guide explaining what they are and how to deploy your own.

Do LLMs tell the truth? LLMs appear to contain a specific “truth direction” that can cause them to lie (or spew out wrong information).

AI-powered robots could lead to more biodiverse farms. Autonomous farming robots use GPS to plant and harvest different crops.

Startup News

Google takes legal responsibility for your AI usage. It announced legal protection for its customers using its generative AI services.

OpenAI’s revenue surpasses a $1.3 billion annualized rate. This means its revenue has increased 30% since the summer.

Research

Jigsaw — a system that enables designers to leverage AI foundation models seamlessly for creative tasks.

Lemur — an LLM that combines natural language and coding proficiency to serve as adaptable language agents.

Prometheus — an open-source LLM designed for fine-grained text evaluation, addressing limitations posed by closed-source LLMs.

Tools

Lakera — cybersecurity for your LLM applications.

Decode — AI tool that tells you how to lower your tax bill.

Mentor.AI — chat with industry-specific AI chatbots.

Headshot AI — an open-source tool that generates AI headshots from a photo.

Insanely Cool Tools: Discover insanely cool tools, websites, and apps from around the web. Join 20,000+ startup founders, early adopters, and tech enthusiasts.

TWEET OF THE DAY

The President of OpenAI retweets a way to use DALL-E 3 and ChatGPT to create your own fantasy world. More on DALL-E 3, here.

Tag us on Twitter @BotEatBrain for a chance to be featured here tomorrow.

AI ART-SHOW

Until next time 🤖😋🧠

What'd you think of today's newsletter?
