Westside People
    Google’s PaLM 2 uses nearly five times more text data than its predecessor

By Avery Kensington | May 17, 2023 | 4 Mins Read
    • Google’s PaLM 2 large language model uses nearly five times as much text data for training as its predecessor, CNBC has learned.
    • In announcing PaLM 2 last week, Google said the model is smaller than the previous PaLM but uses more efficient “techniques.”
    • The lack of transparency about training data in AI models has become an increasingly hot topic among researchers.

    Sundar Pichai, CEO of Alphabet Inc., during the Google I/O Developers Conference in Mountain View, Calif., on Wednesday, May 10, 2023.

    David Paul Morris | Bloomberg | Getty Images

    CNBC has learned that Google’s new large language model, which the company announced last week, uses nearly five times as much training data as its predecessor from 2022, allowing it to perform more advanced coding, math and creative writing tasks.

    PaLM 2, the company’s new public-use large language model (LLM) unveiled at Google I/O, has been trained on 3.6 trillion tokens, according to internal documents seen by CNBC. Tokens, which are strings of words, are an important building block for training LLMs, because they teach the model to predict the word that will appear next in a sequence.
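    The next-word prediction described above can be illustrated with a toy sketch. This is not Google’s actual pipeline — it assumes a hypothetical whitespace “tokenizer,” whereas production LLMs use subword tokenizers and numeric token IDs — but it shows how a stream of tokens yields context/next-token training pairs:

    ```python
    # Toy sketch of how token sequences become next-token training examples.
    # Assumption: whitespace splitting stands in for a real subword tokenizer.
    def to_tokens(text):
        return text.split()

    def next_token_pairs(tokens):
        # Each example pairs a context (all tokens so far) with the token
        # that follows it; the model learns to predict the second from the first.
        return [(tokens[:i], tokens[i]) for i in range(1, len(tokens))]

    tokens = to_tokens("the model predicts the next word")
    pairs = next_token_pairs(tokens)
    print(pairs[0])   # (['the'], 'model')
    print(pairs[-1])  # (['the', 'model', 'predicts', 'the', 'next'], 'word')
    ```

    A 3.6-trillion-token corpus simply supplies vastly more such pairs than the 780 billion tokens used for the original PaLM.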

    Google’s previous version of PaLM, which stands for Pathways Language Model, was released in 2022 and trained on 780 billion tokens.

    While Google was eager to show the power of its AI technology and how it could be integrated into search, emails, word processing, and spreadsheets, the company was unwilling to publish the volume or other details of its training data. OpenAI, the creator of the Microsoft-backed ChatGPT, has also kept secret the details of its latest LLM, GPT-4.

    The companies say the reason for the lack of disclosure is the competitive nature of the business. Google and OpenAI are rushing to attract users who might want to search for information using chatbots instead of traditional search engines.

    But as the AI arms race rages on, the research community is calling for more transparency.

    Since revealing PaLM 2, Google has said the new model is smaller than previous LLMs, which is significant because it means the company’s technology is becoming more efficient while accomplishing more complex tasks. PaLM 2, according to internal documentation, was trained on 340 billion parameters, an indication of the complexity of the model. The initial PaLM was trained on 540 billion parameters.

    Google did not immediately provide comment for this story.

    Google said in a blog post about PaLM 2 that the model uses a “new technique” called compute-optimal scaling. This makes the LLM “more efficient with better overall performance, including faster inference, fewer parameters to serve, and a lower serving cost.”

    In announcing PaLM 2, Google confirmed previous CNBC reports that the model is trained on 100 languages and performs a wide range of tasks. It’s already being used to power 25 features and products, including the company’s experimental chatbot Bard. It’s available in four sizes, from smallest to largest: Gecko, Otter, Bison, and Unicorn.

    Based on public disclosures, PaLM 2 is more powerful than any existing model. Facebook’s LLM, called LLaMA, which it announced in February, was trained on 1.4 trillion tokens. The last time OpenAI shared ChatGPT’s training volume was with GPT-3, when the company said the model had been trained on 300 billion tokens. OpenAI released GPT-4 in March and said it shows “human-level performance” on several professional tests.

    LaMDA, a conversational LLM that Google introduced two years ago and promoted in February alongside Bard, was trained on 1.5 trillion tokens, according to the latest documents seen by CNBC.

    As new AI applications quickly reach the mainstream, so does the debate over the underlying technology.

    Mehdi Elmohamady, a senior research scientist at Google, resigned in February over the company’s lack of transparency. On Tuesday, OpenAI CEO Sam Altman testified at a hearing of the Senate Judiciary subcommittee on privacy and technology, and agreed with lawmakers that a new system is needed to deal with AI.

    “For a technology that is so new, we need a new framework,” Altman said. “Certainly companies like ours have a lot of responsibility for the tools we put out into the world.”

    — CNBC’s Jordan Novet contributed to this report.

    Watch: Sam Altman, CEO of OpenAI, calls for AI oversight
