
Uncovering the secrets of effective demand forecasting in the B2B industry


Yann LeCun, chief scientist at Meta and a pioneer of deep learning, said he believes it will take decades for current AI systems to resemble sentient beings with common sense and the capacity to do more than creatively summarize massive amounts of text.

His viewpoint differs from that of Nvidia CEO Jensen Huang, who recently declared that within five years AI will be “fairly competitive” with humans, surpassing them in several mentally demanding tasks.

“I know Jensen,” LeCun said at a recent event marking the tenth anniversary of the Fundamental AI Research team at Meta, Facebook’s parent company. LeCun claimed the Nvidia CEO stands to gain a great deal from the AI frenzy: “He’s providing the weapons in this AI war.”

Regarding researchers trying to create artificial general intelligence (AGI), or AI comparable to human intelligence, LeCun remarked, “If you think AGI is in reach, the more GPUs you have to buy.” Researchers at companies like OpenAI will require more Nvidia chips as long as they pursue AGI.

LeCun predicted that “dog-level” or “cat-level” AI will probably arrive years before human-level AI. He argued that the technology sector’s current emphasis on language models and text data will not be enough to develop the sophisticated, human-like AI systems that scientists have envisioned for decades.

LeCun declared, “Text is a very poor source of information,” pointing out that the volume of text used to train current language models would take a human roughly 20,000 years to read.

LeCun added, “Train a system on the equivalent of 20,000 years of reading material, and they still don’t understand that if A is the same as B, then B is the same as A. There are a lot of really basic things about the world that they just don’t get through this kind of training.”

Unlike Microsoft, Google, and other tech giants, Meta is not currently placing a significant bet on quantum computing.
