Markov Chains and Weighted Graphs: Modeling How the World Moves

By: Namish

Published
3 min read

Things like the weather are unpredictable. Which website or webpage a user will visit next is unpredictable. There is so much unknown in some aspects of real life. Yesterday in my AP Physics class, we were learning about gravitational attraction, and there was so much unknown, but we found the known in the unknown. The same concept applies here! To understand the unknown, we build models and systems that make it a little less unknown. Two popular tools for this are weighted graphs and Markov chains. Together, they help us represent and analyze systems where outcomes depend on probability rather than certainty.

In computer science, a graph is a data structure made up of nodes, often called vertices, and edges that connect them. Nodes can represent cities, web pages, or game states, while edges represent the relationships or connections between them. In the Java graph class my class wrote, each node held a List containing that node's neighbors.
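As a minimal sketch of that adjacency-list design (the class and field names here are my own, not necessarily the ones we used in class):

```java
import java.util.ArrayList;
import java.util.List;

// A minimal graph node: each node stores a value and a List of its
// neighbors, mirroring the adjacency-list design described above.
class Node {
    String value;
    List<Node> neighbors = new ArrayList<>();

    Node(String value) {
        this.value = value;
    }

    void addNeighbor(Node other) {
        neighbors.add(other);
    }
}
```

Connecting two nodes is then just `a.addNeighbor(b)`, and walking the graph means following each node's neighbor list.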

A weighted graph is similar, but now each edge has a numerical value attached to it. This value can represent some kind of cost, distance, frequency, or probability. For example, in a navigation system, the weight on an edge might represent the distance between two cities. Algorithms can then analyze this graph to find the shortest route between locations. That's how navigation systems know how long drivers need to stay on a certain road, or when to tell them to merge!
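A weighted version might swap the plain neighbor list for a map from each neighbor to the weight of the edge leading to it. A sketch, with an illustrative distance value:

```java
import java.util.HashMap;
import java.util.Map;

// A weighted graph node: instead of a plain neighbor list, each node
// maps a neighbor to the weight of the edge leading to it
// (for a navigation system, a distance).
class WeightedNode {
    String name;
    Map<WeightedNode, Double> edges = new HashMap<>();

    WeightedNode(String name) {
        this.name = name;
    }

    void connect(WeightedNode other, double weight) {
        edges.put(other, weight);
    }
}
```

With edges stored this way, a shortest-path algorithm like Dijkstra's can sum edge weights along candidate routes and pick the cheapest one.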

Building on the idea of graphs, Markov chains use nodes that store values, Strings in particular, to formulate sentences (like an AI)! A Markov chain is a mathematical model used to describe systems based on probabilities. The key idea behind a Markov chain is the Markov property, which states that the next state depends only on the current state, not on the sequence of states that came before it. Simply said, the system has no memory of what came before.

Markov chains can be represented visually using weighted directed graphs. Each node represents a possible state, and each edge represents the probability of moving from one state to another. The weights on the edges represent these probabilities, and the outgoing probabilities from a node typically add up to 1.

An example of this is the weather. There could be three states: sunny, cloudy, and rainy. If one day is sunny, then the chance of it being sunny the next day could be 70%, with a 20% chance of it being cloudy and a 10% chance of rain. Each of these probabilities can be represented as a weighted edge between nodes.
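The weather chain can be sketched as a transition table plus a sampling step. The 70/20/10 row for sunny comes from the example above; the rows for cloudy and rainy are placeholder numbers I made up:

```java
import java.util.Map;
import java.util.Random;

// A sketch of the weather Markov chain: from each state, the next state
// is drawn according to the outgoing edge weights (probabilities).
// Only the "sunny" row matches the text; the others are placeholders.
class WeatherChain {
    static final Map<String, Map<String, Double>> TRANSITIONS = Map.of(
        "sunny",  Map.of("sunny", 0.7, "cloudy", 0.2, "rainy", 0.1),
        "cloudy", Map.of("sunny", 0.3, "cloudy", 0.4, "rainy", 0.3),
        "rainy",  Map.of("sunny", 0.2, "cloudy", 0.4, "rainy", 0.4)
    );

    static String nextState(String current, Random rng) {
        double roll = rng.nextDouble();  // uniform in [0, 1)
        double cumulative = 0.0;
        // Walk the outgoing edges, accumulating probability mass,
        // and return the state whose slice the roll lands in.
        for (Map.Entry<String, Double> e : TRANSITIONS.get(current).entrySet()) {
            cumulative += e.getValue();
            if (roll < cumulative) {
                return e.getKey();
            }
        }
        return current;  // fallback for floating-point rounding
    }
}
```

Because each row sums to 1, the cumulative-probability trick turns one random number into a weighted choice among the outgoing edges.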

Another application is text generation. Some AI text models in their early stages used Markov chains to predict the next word in a sentence based on the current word. While modern AI models are far more advanced, the basic concept of predicting future states from probabilities is still a vital part of machine learning.
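A word-level generator in the same spirit might look like this minimal sketch: record which words followed each word in some training text, then generate by hopping from word to word with a random choice among recorded successors.

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;
import java.util.Random;

// A tiny word-level Markov text generator: for each word, remember the
// words that followed it, then sample a successor of the current word.
class TextChain {
    Map<String, List<String>> successors = new HashMap<>();

    void train(String text) {
        String[] words = text.split("\\s+");
        for (int i = 0; i < words.length - 1; i++) {
            successors.computeIfAbsent(words[i], k -> new ArrayList<>())
                      .add(words[i + 1]);
        }
    }

    String generate(String start, int length, Random rng) {
        StringBuilder out = new StringBuilder(start);
        String current = start;
        for (int i = 1; i < length; i++) {
            List<String> options = successors.get(current);
            if (options == null || options.isEmpty()) break;
            current = options.get(rng.nextInt(options.size()));
            out.append(' ').append(current);
        }
        return out.toString();
    }
}
```

Storing duplicates in the successor lists means more frequent follow-ups are picked more often, which is exactly the weighted-edge idea from earlier.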

Markov chains sit at an interesting intersection of graph theory, probability, and algorithm design. By representing systems as weighted graphs and analyzing transitions between states, we can model processes from weather forecasts to user behavior online.

In a world full of uncertainty, we found the known in the unknown.
