- More jobs in the business sector are now under threat, thanks to advancements in generative Artificial Intelligence (Gen AI)
- The International Labour Organisation has conducted a study to assess the impact of generative AI on job roles across different countries
- The report reveals the jobs that are most vulnerable to being taken over by generative AI, and recommends action points for the governments of the countries
Legit.ng journalist Ruth Okwumbu-Imafidon has over a decade of experience in business reporting across digital and mainstream media.
As generative artificial intelligence (AI) rapidly advances worldwide, a new report by the International Labour Organisation (ILO) and Poland’s National Research Institute (NASK) warns that up to 25% of jobs globally face possible disruption.
The report, titled “Generative AI and Jobs: A Refined Global Index of Occupational Exposure”, released on May 20, 2025, estimates how generative AI technologies could impact the labour market across countries.
The report puts the global average of jobs at risk at 25%, but notes that high-income countries face a higher risk of about 34% of jobs, as they have higher digital integration compared to others.
Platforms like Google have incorporated more AI features to make search results more relevant and context-aware.
Women's job roles are at higher risk
The ILO report also estimates that women face a higher risk of losing their jobs to generative AI: in high-income countries, 9.6% of female workers could lose their jobs, compared with a projected 3.5% of jobs held by male workers.
The report also identifies some of the most vulnerable jobs to include clerical roles, and jobs in software development, finance, and media.
However, full automation remains limited, as many roles still require human judgment and collaboration to be effective.
ILO report provides insights into jobs to be lost to AI
Providing insights into the report, lead author Pawel Gmyrek, an ILO Senior Researcher, said that they combined expert reviews, human insights, and generative AI models to come up with a replicable method for countries to assess their risk and plan a response strategy.
Gmyrek described the end product as a tool grounded in real-world jobs, with an “occupational vulnerability index” that details the impact of generative AI on different job roles across different countries.
Senior Economist Janine Berg added that it is a tool that the governments of different countries could use to prepare their labour markets for the digital future, the SUN News reports.
Berg explained that the vulnerability need not translate to job losses if governments deploy the appropriate policies, digital infrastructure, and inclusive workforce skills to moderate the disruptions.
For instance, in Nigeria, tech giant Microsoft has shared plans to invest $1 million (about N1.6 billion) to train 1 million Nigerians in artificial intelligence (AI).
Generative AI competes for market dominance
Since ChatGPT burst onto the scene in late 2022, generative artificial intelligence (GenAI) models have been vying to take the lead, with the US and China striving to produce the best AI assistant.
Legit.ng reported that ChatGPT became the first to make generative AI freely available to the public as a dedicated application, even though several models existed before then.
These GenAI models are popular for their ability to create works like images, videos, and written content, skills previously reserved for humans.
Source: Legit.ng