AI platform Groq reduces presence-breaking lags

[One of the many aspects of interactions with media technologies that can ‘break’ presence is lag, the delay between the user’s action and the technology’s response. In a February 20, 2024 essay in Stratechery, Matthew Ball explains the need for reducing lag in interactions with AI:

“[T]he closer an AI comes to being human, the more grating and ultimately gating are the little inconveniences that get in the way of actually interacting with said AI. It is one thing to have to walk to your desk to use a PC, or even reach into your pocket for a smartphone: you are, at all times, clearly interacting with a device. Having to open an app or wait for text in the context of a human-like AI is far more painful: it breaks the illusion in a much more profound, and ultimately disappointing, way. Groq suggests a path to keeping the illusion intact.”

As the story below from Tom’s Guide explains, a new AI platform called Groq reduces the likelihood of presence-breaking lags. Note the link to the free demonstration. –Matthew]

Forget ChatGPT — Groq is the new AI platform to beat with blistering computation speed

AI has a need for speed that Groq wants to deliver

By Christoph Schwaiger
February 20, 2024

Groq, a company that created custom hardware designed for running AI language models, is on a mission to deliver faster AI — 75 times faster than the average human can type, to be precise.

Speed is very important when it comes to using AI. When you’re having a conversation with an AI chatbot, you want the responses to arrive in real time. If you’re asking it to compose an email, you want the results in seconds so that you can send it off and move on to the next task.

Groq (not to be confused with Elon Musk’s Grok chatbot — and no, they aren’t too happy about the similar names) specializes in developing high-performance processors and software solutions for AI, machine learning (ML), and high-performance computing applications.

So while the Mountain View-based company (currently) doesn’t train its own AI language models, it can make ones developed by others run really fast.

How does it achieve this?

Groq uses different hardware than its competition. And the hardware they use has been designed for the software they run, rather than the other way around.

They built chips that they’re calling language processing units (LPUs) that are designed for working with large language models (LLMs). Other AI tools usually use graphics processing units (GPUs) which, as their name implies, are optimized for parallel graphics processing.

Even if they’re running chatbots, AI companies have been using GPUs because they can perform technical calculations quickly and are generally quite efficient. Building on the example of chatbots, LLMs such as GPT-3 (one of the models that ChatGPT uses) work by analyzing prompts and generating text for you based on a series of predictions about which word is most likely to come next.
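To make that prediction loop concrete, here is a minimal sketch of autoregressive next-word generation. It uses the small, openly available GPT-2 model from Hugging Face as a stand-in for the much larger models the article mentions; the prompt, generation length, and greedy decoding are illustrative choices, not a description of how any particular chatbot is configured.

    # Minimal sketch of next-word prediction: each step predicts the most
    # likely next token given everything generated so far, appends it, and
    # repeats. Requires `pip install torch transformers`.
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained("gpt2")
    model = AutoModelForCausalLM.from_pretrained("gpt2")

    prompt = "Dear team, I'm writing to let you know"
    input_ids = tokenizer(prompt, return_tensors="pt").input_ids

    for _ in range(25):
        with torch.no_grad():
            logits = model(input_ids).logits
        next_token = logits[0, -1].argmax().view(1, 1)  # greedy choice of next token
        input_ids = torch.cat([input_ids, next_token], dim=1)

    print(tokenizer.decode(input_ids[0]))

Because every new word requires another full pass through the model, the time per token adds up quickly, which is exactly the kind of sequential workload Groq says its hardware is built for.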

Since Groq’s LPUs are specifically designed to deal with sequences of data (think DNA, music, code, natural language) they perform much better than GPUs. The company claims its users are already using its engine and API to run LLMs at speeds up to 10 times faster than GPU-based alternatives.

Try it out

You can try it out for yourself for free and without installing any software here using regular text prompts.

Groq currently runs Llama 2 (created by Meta), Mixtral-8x7b, and Mistral 7B.
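For readers who would rather script against the API mentioned above than use the web demo, a rough sketch of sending a prompt to one of those hosted models and timing the round trip might look like the following. The endpoint URL, model identifier, and response layout are assumptions based on an OpenAI-style chat-completions interface; check Groq’s own documentation before relying on them.

    # Hedged sketch: send a prompt to a hosted model and measure response time.
    # Assumes an OpenAI-compatible chat-completions endpoint and a GROQ_API_KEY
    # environment variable; the URL, model name, and JSON fields may differ.
    import os
    import time
    import requests

    API_URL = "https://api.groq.com/openai/v1/chat/completions"  # assumed endpoint
    headers = {"Authorization": f"Bearer {os.environ['GROQ_API_KEY']}"}
    payload = {
        "model": "mixtral-8x7b-32768",  # assumed identifier for Mixtral-8x7b
        "messages": [
            {"role": "user", "content": "Write a two-sentence email declining a meeting."}
        ],
    }

    start = time.perf_counter()
    response = requests.post(API_URL, headers=headers, json=payload, timeout=30)
    elapsed = time.perf_counter() - start
    response.raise_for_status()

    reply = response.json()["choices"][0]["message"]["content"]
    print(f"Answered in {elapsed:.2f}s:\n{reply}")

Timing the request this way makes the article’s central claim easy to test for yourself: the shorter that elapsed time, the less likely the delay is to break the conversational illusion.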

On X, Tom Ellis, who works at Groq, said custom models are in the works but that they’re concentrating on building out their open source model offerings for now.
