Could AI Threaten The Grid? Companies Like BEN Are Bringing Efficient AI Technology To The Forefront

--News Direct--

By Meg Flippin, Benzinga

Artificial intelligence is changing the world, but it’s coming at a heavy cost. Training and using AI models requires power – lots of power – which is taking an increasing toll on national infrastructure, operating costs and the environment. Just one request on ChatGPT requires almost ten times more electricity than a Google search, and the AI service has a daily power consumption roughly equal to that of 180,000 U.S. households. Furthermore, a single ChatGPT conversation uses almost 17 ounces of water – a little more than a standard bottle of drinking water… and that’s just one AI model in a single AI service.
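
To put those figures in rough perspective, here is a minimal back-of-envelope sketch in Python. The per-search and per-household numbers are illustrative assumptions, not figures from this article.

    # Back-of-envelope sketch of the scale described above.
    # Assumed inputs (illustrative only, not from the article):
    GOOGLE_SEARCH_WH = 0.3                       # assumed energy per Google search (Wh)
    CHATGPT_REQUEST_WH = GOOGLE_SEARCH_WH * 10   # "almost ten times more electricity"
    US_HOUSEHOLD_KWH_PER_DAY = 29                # assumed average U.S. household use (kWh/day)

    # Daily consumption "roughly equal to 180,000 U.S. households":
    daily_kwh = 180_000 * US_HOUSEHOLD_KWH_PER_DAY
    requests_covered = daily_kwh * 1_000 / CHATGPT_REQUEST_WH

    print(f"Implied daily consumption: about {daily_kwh / 1_000_000:.1f} GWh")
    print(f"Requests that budget would cover: about {requests_covered / 1e9:.1f} billion per day")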

We’ve all seen the AI market expanding at unprecedented rates, with companies vying for a piece of the exploding market. To support this tidal wave of technology, data centers are being built as fast as possible, requiring enormous quantities of specialized processors, massive security infrastructure and vast amounts of electricity.

Over the next decade, electricity demand from these types of data centers is projected to double, and by 2040 an estimated 14% of global emissions will come from the Information and Communications Technology (ICT) industry, driven largely by these infrastructures. Over the next 15 years, Amazon.com Inc. (NASDAQ: AMZN) alone is expected to spend more than $150 billion building new data centers to support its own AI efforts.

The U.S. power grid simply won’t be able to handle the increased load without significant investment – Goldman Sachs pegs the required investment at more than $50 billion. This comes at a time when the nation is already committed to major spending to upgrade the grid ($22 billion since 2021) to support growing demands arising from national initiatives to transition away from natural gas appliances, the expansion of the EV market, crypto mining operations, domestic manufacturing and the increasing need to safeguard against disruptions caused by extreme weather events or the heightened risk of cyberattacks.

Solutions To Big Problems Can Be Surprisingly Small

We've faced similar challenges before; in 2023, the U.S. completed its federal transition from incandescent light bulbs to energy-efficient alternatives to alleviate power demands on national infrastructure. Although no individual light bulb posed a substantial problem, the sheer volume made a significant impact. Similarly, AI applications are now emerging across a wide range of uses, and their growth may ultimately be limited only by how efficient they are at scale.

The current state of AI mimics the introduction of electricity: it is poised to enable major new industries and drive economies. Today AI relies heavily on Graphics Processing Units (GPUs), specialized processors originally designed to accelerate graphics rendering. The parallel structure of GPUs is also well suited to traditional AI model training and is used broadly across the Artificial Intelligence of Things (AIoT), which raises efficiency concerns at scale. AI companies are effectively over-deploying the most advanced, energy-intensive processors to fulfill some of their simplest application needs. While a one-size-fits-all approach can work in early AI applications, it simply can’t be the sustainable standard for all AI implementations.

The inability to adapt the security, scalability and efficiency of AI solutions to specific applications is not unlike driving a tractor trailer to pick up your groceries. While it will certainly do the job, it is not optimized for the task and is tremendously inefficient as well as expensive. It’s this mismatch between application needs and deployed hardware that leads to a huge array of unintended consequences. The growing demand for AI is undeniable, but when AI relies on GPUs across the board, the resulting applications overburden our already fragile infrastructure.

AI companies need to look for alternative ways to bring application needs and computing resources into alignment and steer away from a path that could threaten our infrastructure. Companies like Brand Engagement Network Inc., or BEN (NASDAQ: BNAI), realize this and have optimized their solutions to deliver the power and performance of AI in a way that is scalable and supportable.

BEN’s ELM – A Solution To The Power Problem

So how does BEN do it? Through its Efficient Language Models (ELMs): a combination of sectioning and optimizing language models for specialized tasks. This patent-pending technology concentrates on efficiency and application specialization, in contrast to more traditional LLMs – like those behind OpenAI’s ChatGPT – that attempt to generalize everything into a single, indiscriminate model for generative purposes.

Although this may seem like a small distinction, the computational and processing power required by each approach differs significantly. Because traditional LLMs rely on all-inclusive models, their solutions are not narrowly defined: a single model is tasked with addressing every need of every challenge or application. Not only does this increase the likelihood of generated errors, it also demands massive parallel processing and, when timely responses are expected, necessitates the use of GPUs. BEN’s ELM, on the other hand, focuses on defined application needs, allowing a secure, small-footprint, concentrated solution. This means solutions built on the ELM can run with the limited resources of CPUs, which are more readily available, significantly cheaper and consume less power.
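
As a rough illustration of the CPU-versus-GPU point – this is not BEN’s ELM, and the model named below is simply a publicly available stand-in chosen as an assumption – a compact, task-specific model can be served entirely on a commodity CPU:

    # Illustrative only: a small, task-specific model served on a CPU.
    # This is a generic example, not BEN's ELM or any model referenced in the article.
    from transformers import pipeline

    # device=-1 forces CPU-only inference; no GPU is involved.
    classifier = pipeline(
        "text-classification",
        model="distilbert-base-uncased-finetuned-sst-2-english",
        device=-1,
    )

    print(classifier("The new billing portal is easy to use."))
    # A narrow task like this runs comfortably on commodity CPU hardware, whereas a
    # general-purpose LLM answering the same request would typically be served on GPUs.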

Reliance on CPUs opens up many more deployment options, including SaaS, private cloud, mobile and on-premises solutions, in industries such as healthcare and financial services that have struggled to minimize the potential risk of data breaches and leaks. Typically, CPUs are significantly cheaper to deploy and operate, already well established in the market and, most importantly, available in large quantities. That is not the case with GPUs, which are in the midst of an availability crunch that has even forced Elon Musk to get creative about procuring these processors for his various companies.

ELM + RAFT: Powerful Yet Efficient Combination

BEN’s ELM also augments RAFT (Retrieval Augmented Fine-Tuning) systems to ensure its applications are reliable, predictable and efficient. A significant challenge posed by AI is the risk of ‘hallucinations,’ where an AI gives misleading or outright false answers because it was built on unknown data sources and designed to generate a response no matter what. Hallucinations are a lot like the wasted heat from incandescent lights: they still demand the same power to generate a response, but they are an unintended consequence of traditional LLM technology. Some estimates indicate that hallucinations can occur as often as 27% of the time. BEN’s ELM draws from carefully selected and validated data sets, meaning false information cannot be presented or retrieved unless explicitly intended. "More than generative AI, we like to call it retrieval AI," said Paul Chang, BEN’s Co-CEO, in a recent interview. Because BEN uses far fewer parameters than larger models like ChatGPT, it can offer AI that is scalable and can be tailored to specific use cases at lower cost and with less energy demand and waste.
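
The "retrieval AI" idea can be sketched in a few lines: answers come only from a vetted document set, and the system declines to answer when nothing relevant is found. The snippet below is a generic illustration with made-up documents and naive matching, not a description of BEN’s RAFT implementation.

    # Generic sketch of retrieval-grounded answering over a vetted document set.
    # The documents and the matching rule are assumptions for illustration only.
    VETTED_DOCS = {
        "refund_policy": "Refunds are issued within 14 days of a returned purchase.",
        "support_hours": "Phone support is available weekdays from 8am to 6pm.",
    }

    def retrieve(question: str) -> str | None:
        """Return the best-matching vetted document by naive word overlap."""
        q_words = set(question.lower().replace("?", "").split())
        best_doc, best_score = None, 0
        for text in VETTED_DOCS.values():
            score = len(q_words & set(text.lower().rstrip(".").split()))
            if score > best_score:
                best_doc, best_score = text, score
        # Require meaningful overlap; otherwise report that no vetted source exists.
        return best_doc if best_score >= 2 else None

    def answer(question: str) -> str:
        doc = retrieve(question)
        # Decline rather than hallucinate when no vetted source supports an answer.
        return doc if doc else "I don't have a vetted source for that."

    print(answer("When are refunds issued?"))            # grounded answer from the vetted set
    print(answer("What is the CEO's favorite color?"))   # declined: no supporting document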

BEN’s CPU-friendly and hallucination-averse approach has not been lost on its growing customer base, which is drawn to BEN for its innovative and purposeful AI, optimized for efficiency, scalability and security. From healthcare to financial services, customers choose BEN for a multitude of reasons. AI has the potential to change the world for the better, but it has to be done in a way that is compatible with our infrastructure and our environment. With growing demands on the grid, companies like BEN are bringing powerful and impactful AI to the masses – all the while ensuring it can be supported over the long term.

Featured photo by Zosia Szopka on Unsplash.

Benzinga is a leading financial media and data provider, known for delivering accurate, timely, and actionable financial information to empower investors and traders.

This post contains sponsored content. This content is for informational purposes only and is not intended to be investing advice.

Contact Details

Benzinga

+1 877-440-9464

info@benzinga.com

Company Website

http://www.benzinga.com

View source version on newsdirect.com: https://newsdirect.com/news/could-ai-threaten-the-grid-companies-like-ben-are-bringing-efficient-ai-technology-to-the-forefront-195011391
