Online Gaming Rules, 2026 Focus On Promotion And Regulation

The Promotion and Regulation of Online Gaming (PROG) Act, 2025 was enacted by Parliament in August 2025 as a landmark legislation to safeguard citizens from the growing menace of online money games while creating an enabling framework for e-sports and online social games. The Act reflects the Government’s resolve, articulated by Prime Minister Narendra Modi, to position India as a global hub for gaming, innovation and creativity, and at the same time protect society from the financial, psychological and social distress caused by predatory online money gaming platforms.

Section 19 of the Act empowers the Central Government to make rules to carry out its provisions. The Ministry of Electronics and Information Technology (MeitY), as the nodal Ministry, has accordingly prepared the Promotion and Regulation of Online Gaming Rules, 2026 (“the Rules”), which will come into force on 1st May, 2026. The Rules have been finalised after extensive inter-Ministerial consultations and vetting by the Department of Legal Affairs.

Purpose of the Rules

The Rules are the operational architecture of the parent Act. Their purpose is to:

  • provide a clear, transparent and time-bound mechanism to determine whether an online game is an online money game (and therefore prohibited) or a permissible online social game or e-sport;
  • establish the Online Gaming Authority of India as a unified, digital-first regulator for the sector;
  • create a statutory registration regime for e-sports and such categories of online social games as may be notified;
  • prescribe mandatory user safety features, grievance redressal and transparency obligations for online game service providers;
  • lay down the procedure for inquiry and imposition of civil penalties under section 12 of the Act; and
  • provide an appellate mechanism to ensure accountability, fairness and observance of the principles of natural justice.

Guiding Policy Objectives

  • Protecting citizens, especially children and vulnerable users, from the harms of online money gaming, addictive design and misleading promises of quick wealth;
  • Ensuring regulatory certainty for the industry through clear criteria for determination, predictable timelines and a digital-first process;
  • Safeguarding the financial system by preventing banks, payment systems and financial institutions from facilitating transactions linked to prohibited online money games;
  • Enabling coordinated enforcement between the Authority, financial regulators, law enforcement agencies and State Governments; and
  • Upholding user rights through a functional, two-tier grievance redressal mechanism and a statutory right of appeal.

The Regulatory Framework at a Glance

The Rules are organised into 6 Parts and 26 Rules covering the following pillars of the regulatory framework:

1. Online Gaming Authority of India (Part II, Rules 3–7)

  • The Online Gaming Authority of India is constituted as an attached office of MeitY with its head office at the NCT of Delhi.
  • It is structured as a compact, multi-sectoral body chaired by the Additional Secretary, MeitY (ex officio), with JS-level representatives from the MHA, Finance (Department of Financial Services), MIB, Youth Affairs and Sports, and Law and Justice (Department of Legal Affairs).
  • The Authority is designed to function, as far as practicable, as a digital office.
  • Functions include: maintaining and publishing the list of online money games, inquiring into complaints, issuing directions, orders and codes of practice, entertaining appeals against decisions of service providers on grievances, and coordinating with financial institutions and law-enforcement agencies for effective enforcement.

2. Determination of an Online Game (Part III, Rules 8–11)

  • The Rules prescribe a determination test to classify whether an online game constitutes an online money game. Determination is triggered in three situations: (i) suo motu action by the Authority; (ii) an application by a service provider offering the game as an e-sport; or (iii) a notification by the Central Government requiring a category of social games to be determined.
  • Rule 9 lists objective factors for determination — payment of fees or stakes, expectation of monetary winnings, the structure of the revenue model, and the manner in which rewards or in-game assets are redeemed or monetised outside the game.
  • Determination shall, as far as practicable, be completed within 90 days of a complete application or of notice issued in a suo motu proceeding (Rule 10).
  • The outcome is recorded in a determination order, which is specific to the particular game and provider.

3. Registration of Online Games (Part IV, Rules 12–19)

  • Registration is required ONLY where the Central Government so notifies — having regard to risk to users (including children), scale of participation, financial transactions and country of origin — and for every online game intended to be offered as an e-sport.
  • On successful determination and registration, the Authority issues a digital Certificate of Registration with a unique registration number, valid for a period of up to 10 years.
  • An online money game shall not be eligible for recognition or registration as an e-sport under the National Sports Governance Act, 2025.
  • Registered service providers are required to prominently display the details of determination or registration on the interface through which the game is offered, designate a point of contact, comply with data retention directions, and observe directions issued in relation to facilitation of payments.

4. User Safety Features

  • Rule 2(1)(i) introduces the concept of user safety features — technical, procedural, operational, behavioural or system-related safeguards appropriate to the risk profile of the game.
  • These include age verification and age-gating, time restrictions, parental controls, user reporting tools, counselling support, and fair-play and integrity monitoring. Service providers are required to disclose their user safety features and internal grievance mechanisms at the time of application for determination or registration (Rule 23).

5. Two-Tier Grievance Redressal and Appellate Mechanism (Rules 7 and 20)

  • Every online game service provider offering an online social game or e-sport must establish and maintain a functional grievance redressal mechanism.
  • A user dissatisfied with the provider’s resolution (or in case of non-redressal) may approach the Authority within 30 days, which shall endeavour to dispose of the appeal within a further 30 days.
  • A second appeal lies before the Appellate Authority, i.e. the Secretary, MeitY, who shall dispose of appeals, as far as possible, within 30 days of receipt.

6. Penalties and Enforcement (Part V, Rules 21–22)

  • Proceedings are to be conducted in digital mode unless physical presence is deemed necessary, and concluded within 90 days of receipt of a complaint.
  • Penalties are to be proportionate, with the Authority required to consider factors such as gain from non-compliance, loss caused to users, recurrence, gravity and mitigation efforts.

For details, please refer to the Gazette of India CG-DL-E-22042026-271974 dated 22 April 2026.


Report calls for AI toy safety standards to protect young children

AI-powered talking toys should be more strictly regulated and carry new safety kitemarks, according to a report cautioning against their use by small children, since the toys are not necessarily designed with children's psychological safety in mind.

The recommendation comes in the first report from AI in the Early Years, a University of Cambridge project and the first systematic study of how generative AI (GenAI) toys capable of human-like conversation can affect development during the critical years up to age five.

The one-year project, based at the university's Faculty of Education, carried out formal scientific observations of children's first encounters with a GenAI toy.

The report reflects the view of some early-years practitioners that, over time, these toys could be useful in areas of child development, including language and communication skills. The researchers also found, however, that GenAI toys are poor at social and pretend play, misunderstand children, and respond inappropriately to emotions.

In one instance, when a five-year-old child told the toy "I love you", it responded: "As a friendly reminder, please keep interactions in accordance with the guidelines given. Please tell me what you wish me to do."

Although GenAI toys are widely marketed as learning companions or friends, their effect on early-years development has hardly been examined. The report urges parents and teachers to exercise caution, and calls for firmer regulation, transparent privacy policies and new labelling standards so that families can make informed decisions about a toy's suitability.

NGOs help conduct studies

The studies were commissioned by the child poverty charity The Childhood Trust and focused on children in areas of significant socio-economic disadvantage. They were carried out by researchers at the Faculty's Play in Education, Development and Learning (PEDAL) Centre.

Researcher Dr Emily Goodacre said that generative AI toys tend to affirm they are friends with a child who is only beginning to understand what friendship means. Children may start talking to the toy about their emotions and needs instead of discussing them with an adult. And because these toys may misread emotions or respond inappropriately, children could be left without the comfort they sought from the toy – and without the emotional support of an adult, either.

The research was deliberately kept small in scale so that children's play could be observed in greater detail, capturing nuances that would be missed in a larger study.

The researchers surveyed early-years educators to gauge their concerns and attitudes, and conducted more detailed focus groups and workshops with early-years practitioners and 19 leaders of children's charities. They also video-recorded 14 children playing with a GenAI soft toy named Gabbo at London children's centres run by Babyzone, an early-years charity, and interviewed each child and a parent after the play sessions, using a drawing activity to facilitate the conversation.

Most parents and educators believed that AI toys might help develop children's communication skills, and some parents were keen to learn about their educational potential. One told the researchers they would want to buy the toy if it went on sale.

Many were concerned about children forming so-called parasocial relationships with toys. The observations bore this out: children hugged and kissed the toy, said they loved it and, in one case, invited it to play hide-and-seek.

Children believe toys love them back

Goodacre stressed that these responses may simply reflect children's vivid imaginations, but noted the risk of a one-sided relationship with a toy that, as one early-years practitioner remarked, children believe loves them back when it does not.

The children also struggled with the toy's conversation. It ignored their interruptions, confused parents' voices with the child's, and failed to respond appropriately to seemingly significant statements about feelings. Several children visibly grew frustrated when they did not seem to be heard.

When one three-year-old told the toy "I am sad", the toy misheard and answered: "Don't worry! I'm a happy little bot. Let's keep the fun going. What shall we talk about next?" According to the researchers, this could have signalled to the child that their sadness did not matter.

The authors also found that GenAI toys handle social play (playing with multiple children or adults) and pretend play poorly – both of which are important in early childhood development. When a three-year-old tried to give the toy an imaginary present, for example, it replied "I cannot open the present" – and changed the subject.

Most parents were concerned about what data the toy might capture and where it would be stored. While selecting a GenAI toy for the research, the researchers found that the privacy practices of many GenAI toys are opaque or omit crucial information.

AI toys increase digital divide

Almost half of the early-years practitioners surveyed said they did not know where to find credible information on AI safety for young children, and 69% said the sector needed further guidance. They also raised concerns about safeguarding and affordability, with some worried that AI toys would widen the digital divide.

The authors argue that clearer regulation would resolve most of these issues. They recommend limits on the extent to which toys can encourage children to befriend or confide in them, more transparent privacy policies and tighter restrictions on third-party access to AI models.

A recurring theme in the focus groups, study co-author Professor Jenny Gibson added, was that people did not trust tech companies to do the right thing. Clear, robust, enforced standards would go a long way towards building consumer confidence.

The report recommends that manufacturers test toys with children and consult safeguarding experts before launching new products, and urges parents to research GenAI toys before buying them.