How DeepMind teaches AI to play soccer
DeepMind, Google’s AI research arm, has announced a new system that lets simulated agents control their own bodies based on real-world data. The system is built on Neural Probabilistic Motor Primitives (NPMP), a method an AI uses to learn and reproduce motion from real-world motion data.
The team tested the approach in a two-on-two soccer game. NPMP allows humanoid avatars to recreate real player actions such as running, aiming, striking, dribbling, and shooting inside a simulated environment. The team that scores the first goal wins.
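The two-stage idea behind NPMP, as described above, is that low-level movement is learned first from real motion data and then reused by a higher-level policy trained on the game itself. The sketch below illustrates that separation of concerns; the dimensions, the linear "networks," and the function names are all hypothetical stand-ins, not DeepMind's actual architecture.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions; the real NPMP networks are far larger.
OBS_DIM, LATENT_DIM, ACT_DIM = 12, 4, 8

# Stage 1: a frozen low-level "motor primitive" decoder, assumed to have
# been distilled from real motion data. A random linear map stands in
# for the trained network here.
W_dec = rng.normal(size=(LATENT_DIM + OBS_DIM, ACT_DIM)) * 0.1

def low_level_decoder(latent, proprioception):
    """Map a latent motor intention plus body state to joint commands."""
    x = np.concatenate([latent, proprioception])
    return np.tanh(x @ W_dec)  # bounded motor commands

# Stage 2: a high-level policy (which would be trained with RL on the
# soccer task) that only chooses latent intentions, never raw commands.
W_pol = rng.normal(size=(OBS_DIM, LATENT_DIM)) * 0.1

def high_level_policy(task_obs):
    """Choose a latent command, e.g. 'dribble toward the goal'."""
    return np.tanh(task_obs @ W_pol)

# One control step: the policy picks an intention, and the frozen
# decoder turns it into low-level motor output.
obs = rng.normal(size=OBS_DIM)
latent = high_level_policy(obs)
action = low_level_decoder(latent, obs)
print(action.shape)  # one command per joint
```

The point of the split is that the expensive motor-skill learning happens once, and the game-playing policy only has to search over a small latent space of movements rather than raw joint torques.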
The team considers this a harder problem than building AIs that play Go or chess. Instead of choosing moves on a board, each agent must first learn the low-level movements itself and then coordinate seamlessly with its teammate to reach the final goal.
DeepMind’s model is still in its infancy, but it has great potential over time, especially when combining multiple AIs to achieve faster and better results. “It will be interesting to see this AI move beyond the lab environment to real-world robotic applications,” the website commented.