Express Computer

AI, AI Everywhere! But are we Truly Prepared?


Artificial Intelligence (AI) is the buzzing technology that requires no introduction. What it does require is a huge supply of power, a commodity that many developing countries with burgeoning populations, India included, still struggle to provide reliably.

India today is paving its path to becoming a digital-first economy and setting benchmarks in technology adoption, developments that can be attributed in large part to the Government of India's active endeavours. The trend holds for advanced technologies such as cloud computing, edge computing, robotic process automation, blockchain, and AI models. However, the picture is not entirely rosy, and even the roses come with thorns. India's soaring population, with its brimming power demands, often poses a blockade to the technological feats the authorities attempt to achieve.

The International Energy Agency (IEA), in its report Electricity 2024: Analysis and Forecast to 2026, states, “Globally, electricity consumption from data centres, artificial intelligence (AI), and the cryptocurrency sector could double by 2026.” China provides the largest share of global electricity demand growth in terms of volume, but India posts the fastest growth rate through 2026 among major economies, the report adds.

As per a report in The New Yorker, Alex de Vries, a data scientist and researcher from the Netherlands, published a paper in 2023 in Joule, a journal dedicated to sustainable energy, estimating that if Google integrated Generative AI (GenAI) into every search, the power it consumes would rise to around 29 billion kWh (kilowatt-hours) annually, more than what many small countries consume. The publication also reported that OpenAI's popular GenAI model, ChatGPT, consumes around 500,000 kWh a day to serve its over 200 million users.

Meanwhile, TechInsights, in a research note published last month, highlights that AI chips could represent 1.5 percent of electricity use over the next five years, a substantial share of the world's energy.

A peek into AI’s insatiable appetite

AI models are reflections of the massive datasets they feed on, and the entire internet is on their plate. Every time a user runs a query or prompts the model, the AI runs it through the maximum accessible data in its capacity, figures out relevant touchpoints, and frames a response as demanded by the user. Not surprisingly, the ever-learning, self-evolving capabilities of AI models consume even more power than the search-and-response process itself. The sheer volume of users, driven by the technology's soaring popularity, adds further to the power consumption.

A major area where the power demand of AI models peaks is the training period. Training an AI model is critical for enterprise users to ensure the technology accesses a relevant dataset to process queries; the exercise lights a directional path for the technology to learn and evolve accordingly. The entire process of training an AI model can range anywhere from a few minutes to several months, and throughout it, the GPUs powering the machines keep running around the clock, consuming large volumes of power.

On the bright side, experts have pointed out that specialised AI models are significantly more efficient in power consumption than generic models. Therefore, dedicated AI-powered tools like photo editors, video editors, and transcription models consume less power than generic models like Google Gemini, ChatGPT, and Microsoft Copilot.

India and the ‘power’ struggle

The highlights above of this dire situation can be daunting, especially when imagining large-scale adoption of AI across the globe. They paint AI as a power-hungry beast.

In May 2024, India met a record-high power demand of 250 GW; the previous high was 221 GW, met in May 2023. Most of the rise in demand was attributed to extreme temperatures during peak summer, which led to heavy use of air conditioners and desert coolers across the country, another setback of pollution and the non-eco-friendly development practices common in developing nations. The government also points out, in a press release, that renewable sources contributed a significant chunk: solar power during daytime hours, with wind power taking over during non-solar hours, which is a green sign.

Through the eyes of the country's power sector, it is an achievement. On the flip side, it shows rising demand and offers a glimpse of a future in which generation companies must push further to keep supply adequate, while the Ministry may be settling terms with allies beyond the borders to further increase coal imports.

Considering the status quo, the days ahead will only demand more. The day is not far when power costs will skyrocket as countries worldwide grapple for a bigger slice of global power capacity.

The probable panacea for a sustainable tech-driven future

Industry trends suggest that data centres and AI solutions are on a rising curve, and we are all well aware of what that translates to. It is high time we found ways to curb the power consumption of AI models without stunting their growth and development. This calls for industry players to optimise AI models. Some of the ways to build efficient AI models are:

Model Distillation: This is a technique wherein knowledge is transferred from a larger, complex model to a smaller, more efficient deployable model. The training process sieves out unrequired knowledge sets and allows only focussed datasets to transfer. The deployable model hence consumes less energy while delivering effective, desired results, and distillation boosts performance with a minimal memory footprint and computational requirement.
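As a rough illustration, the sketch below shows a minimal distillation loss, assuming PyTorch as the framework; the teacher and student networks are hypothetical stand-ins, with the student trained to match the teacher's softened outputs alongside the ground-truth labels.

```python
# Minimal knowledge-distillation sketch (PyTorch assumed).
# "teacher" and "student" are hypothetical classifiers; the student learns
# from the teacher's temperature-softened output distribution.
import torch
import torch.nn as nn
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.7):
    # Soft targets: match the teacher's temperature-scaled distribution.
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=-1),
        F.softmax(teacher_logits / T, dim=-1),
        reduction="batchmean",
    ) * (T * T)
    # Hard targets: standard cross-entropy on the ground-truth labels.
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1 - alpha) * hard

# Illustrative models: a large teacher and a much smaller deployable student.
teacher = nn.Sequential(nn.Linear(784, 1024), nn.ReLU(), nn.Linear(1024, 10))
student = nn.Sequential(nn.Linear(784, 64), nn.ReLU(), nn.Linear(64, 10))

x = torch.randn(32, 784)                 # dummy batch
labels = torch.randint(0, 10, (32,))
with torch.no_grad():
    t_logits = teacher(x)                # teacher runs in inference mode only
loss = distillation_loss(student(x), t_logits, labels)
loss.backward()                          # only the student's weights get gradients
```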

Quantisation of AI models: Quantisation is a collective term for multiple ways to reduce power usage, computational requirements, and memory footprint while enhancing performance efficiency. Artificial neural networks are made up of activation nodes, the connections between those nodes, and a weight parameter attached to each connection; quantisation targets these areas. The technique leverages lower-bit quantised data that requires minimal movement on-chip and off-chip. The reduction in data movement translates to lower memory bandwidth per result and hence saves power. It also enables lower-bit mathematical operations with quantised parameters, which increases computational efficiency.
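A minimal sketch of post-training dynamic quantisation, again assuming PyTorch: the Linear layers of a hypothetical float32 model are converted so that their weights are stored and multiplied as 8-bit integers, cutting memory movement at inference time.

```python
# Minimal post-training dynamic quantisation sketch (PyTorch assumed).
import torch
import torch.nn as nn

# Hypothetical float32 model standing in for a trained network.
model_fp32 = nn.Sequential(
    nn.Linear(512, 512), nn.ReLU(),
    nn.Linear(512, 10),
)

# Convert Linear-layer weights to int8; activations are quantised on the fly.
model_int8 = torch.quantization.quantize_dynamic(
    model_fp32,
    {nn.Linear},           # layer types whose weights become 8-bit
    dtype=torch.qint8,
)

x = torch.randn(1, 512)
print(model_int8(x).shape)  # inference now runs with quantised weights
```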

Pruning AI Models: While distillation enables the creation of smaller, efficient deployable models, pruning is a technique that cuts down noise, overfitting, and unimportant neural connections in existing models. It helps reduce computational costs, enabling faster and more cost-effective AI inference.
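A minimal sketch of magnitude-based pruning, assuming PyTorch and a hypothetical trained model: the smallest 30 percent of weights in each Linear layer are zeroed out, removing the lowest-importance connections.

```python
# Minimal magnitude-pruning sketch (PyTorch assumed).
import torch
import torch.nn as nn
import torch.nn.utils.prune as prune

# Hypothetical trained model.
model = nn.Sequential(nn.Linear(256, 256), nn.ReLU(), nn.Linear(256, 10))

for module in model.modules():
    if isinstance(module, nn.Linear):
        # Zero out the 30% of weights with the smallest absolute value.
        prune.l1_unstructured(module, name="weight", amount=0.3)
        # Make the pruning permanent (bake the mask into the weight tensor).
        prune.remove(module, "weight")

# Fraction of weights now zero in the first layer:
zeros = (model[0].weight == 0).float().mean().item()
print(f"sparsity of first layer: {zeros:.0%}")
```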

Power-efficient Hardware: Another way out for enterprises is to switch to efficient hardware. While industry players have started coming out with specialised GPUs optimised to run AI models, enterprises can also consider deploying Application-Specific Integrated Circuits (ASICs) and System-on-Chip processors for distributed computing.

Smart Workload Management and Green Data Centre Practices: Enterprises can allocate workloads based on energy availability and efficiency metrics. Data centres can be tuned to process power-hungry workloads during low-demand hours, and renewable sources should be used as much as possible to add to the sustainability factor. Further, enterprises should adopt effective cooling solutions and efficient power-management systems; keeping data centres within a temperature balance saves power and cuts down the excess heat dispersed into the environment, making this a sustainable practice.
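As a simple illustration of demand-aware scheduling, the sketch below picks the slackest contiguous window in a day for a deferrable batch job; the hourly grid-demand figures and the four-hour training job are purely hypothetical, and in practice such data would come from the utility or a data-centre monitoring system.

```python
# Minimal demand-aware batch-scheduling sketch (hypothetical data).
from datetime import time

# Assumed hourly grid-demand index for hours 0-23; higher = more grid strain.
grid_demand = [40, 35, 30, 28, 27, 30, 45, 60, 75, 85, 90, 95,
               97, 96, 94, 90, 88, 92, 98, 95, 85, 70, 55, 45]

def cheapest_window(hours_needed: int) -> int:
    """Return the start hour of the contiguous window with the lowest total demand."""
    best_start, best_cost = 0, float("inf")
    for start in range(24 - hours_needed + 1):
        cost = sum(grid_demand[start:start + hours_needed])
        if cost < best_cost:
            best_start, best_cost = start, cost
    return best_start

# Defer a hypothetical 4-hour model-training job to the lowest-demand window.
start_hour = cheapest_window(4)
print(f"Schedule training job at {time(start_hour)} local time")
```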

Conclusion

AI is here to stay and will grow at an even faster pace in the times ahead. It has already carved out a space in government operations, business processes, and even in the lives of ordinary people, and models like ChatGPT, Gemini, and others have democratised AI worldwide. However, the mammoth power demands that come with AI adoption need to be addressed on a war footing to ensure sustainable development and the successful co-existence of man and technology.
