Can mean field game equilibria amongst exchangeable agents survive under partial observability of their competitors' states?



Classical mean field games (MFGs) concern large games among symmetrically influential agents, each of asymptotically negligible weight. In the absence of a common driving noise, propagation of chaos occurs. The analysis assumes that the agents' initial state distribution is known, making its future evolution deterministic and computable via a fixed-point calculation under a limiting equilibrium policy, if one exists. However, despite equal mutual influence, a given agent can often observe only a limited number of neighboring agents, as dictated by an observability structure characterized by an information access graph. This graph may have low degree even when the number of agents is large. The main question addressed is whether an MFG equilibrium can still emerge asymptotically over time. The answer is affirmative, contingent on conditions derived in this study that depend on the stability properties of the agents' dynamics and on the speed of communication relative to agent reactions. The focus is on independent linear scalar agents coupled through a quadratic cost involving the mean state of the agents, which remains unobservable. To achieve convergence to a mean field equilibrium, the proposed model runs a consensus algorithm on a fast communication time scale, alongside a slower time scale for the agent dynamics. The research explores the agents' ability to accurately estimate the system mean as both time and the number of agents increase.
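The fast-time-scale consensus step can be illustrated with a standard average-consensus iteration: each agent repeatedly averages its estimate with those of its neighbors on the information access graph, and all estimates converge to the global mean state. This is a minimal sketch of generic average consensus with Metropolis-Hastings weights, not the specific scheme analyzed in the paper; the ring graph and agent states below are illustrative assumptions.

```python
import numpy as np

def metropolis_weights(adj):
    """Doubly stochastic weight matrix from an undirected adjacency
    matrix, via Metropolis-Hastings weights (illustrative choice)."""
    n = adj.shape[0]
    deg = adj.sum(axis=1)
    W = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            if i != j and adj[i, j]:
                W[i, j] = 1.0 / (1.0 + max(deg[i], deg[j]))
        W[i, i] = 1.0 - W[i].sum()
    return W

def consensus_mean_estimate(states, adj, n_rounds=500):
    """Each agent averages with its graph neighbors for n_rounds;
    with a connected graph, every estimate approaches the global mean."""
    W = metropolis_weights(adj)
    x = states.copy()
    for _ in range(n_rounds):
        x = W @ x
    return x

# Example: 20 agents on a ring, a low-degree information access graph.
n = 20
rng = np.random.default_rng(0)
states = rng.normal(size=n)           # scalar agent states
adj = np.zeros((n, n))
for i in range(n):
    adj[i, (i + 1) % n] = adj[(i + 1) % n, i] = 1.0
est = consensus_mean_estimate(states, adj)
# All entries of est are close to states.mean(), even though each
# agent ever communicates with only two neighbors.
```

In the two-time-scale setting of the paper, many such consensus rounds are assumed to run between successive updates of the slower agent dynamics, so each agent acts on a near-exact estimate of the unobservable mean state.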

11 pages



G2354.pdf (400 KB)