
UIUC Profs Use Illinois Supercomputer to Take on Partisan Gerrymandering



Though the approval rating for Congress sits at just 18 percent, the re-election rate for its members is approximately 95 percent. Why do politicians with such low approval ratings continue to get re-elected?

Many point to partisan gerrymandering: the practice of drawing legislative district maps that discriminate against one political party to benefit another. Though both parties and many voters oppose the practice, courts have struggled to address gerrymandering, in part because it can be difficult to evaluate whether a map was drawn with partisanship as the main motive.

That's where University of Illinois Urbana-Champaign researchers Wendy K. Tam Cho and Yan Liu come in. Using the Blue Waters supercomputer at Illinois' National Center for Supercomputing Applications (NCSA), they've generated 800 million voter district maps that could be used as an objective way to measure the fairness of a legislative map.

Cho, who's a professor of political science as well as a senior research scientist at NCSA, answered a few questions over email. In this interview, she discusses how their solution could stop partisan gerrymandering, and how supercomputing can improve democracy.

Chicago Inno: In discussing this research, you point out that gerrymandering allows politicians to "pick their voters." How does this happen, and what are the consequences of creating partisan legislative districts?

Wendy K. Tam Cho: Redistricting is mandated by the US Constitution to occur every 10 years following the decennial census. In more than two-thirds of the states, the majority party of the state legislature assumes the primary role in creating a redistricting plan. These are obviously self-interested actors sitting at the helm of a crucially important process.

Moreover, state legislatures, with rare exceptions, enjoy wide latitude in how to draw district lines. The result is a redistricting process that is easily manipulated. Future election results are essentially known even before votes are cast when the district composition is highly skewed toward one party or the other. In these gerrymandered districts where incumbent politicians essentially hand-pick their voters, elections are not competitive. It is not, as we wish to assume in a democracy, the voters choosing their representatives.

Your solution is to create over 800 million potential voting maps using the Blue Waters supercomputer. Explain how this could help legislators and the judicial branch assess the unfairness of a voting map. 

In a partisan gerrymandering case, one needs to assess how partisanship factored into the process. This is difficult since it is hard to untangle what went on in a human's mind while drawing districts. With a computer, it is not difficult to ascertain which factors went into mapmaking. It is straightforward to instruct a computer to draw districts that are equipopulous and contiguous, that keep cities and counties together, or that satisfy any other criteria, while not using any partisan information whatsoever. Every jurisdiction is unique, and its maps are constrained by its population (how many Democrats and Republicans it has), how Democrats and Republicans settle in relation to one another, where the rivers and ocean flow, how the mountains carve up the state, how the cities have developed, the shape of the counties, the racial and/or socio-economic concentrations that have formed over the course of the state's history, etc.
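To make the kinds of constraints Cho describes concrete, here is a minimal sketch of how a candidate plan might be checked against non-partisan criteria. This is not the researchers' code, which runs on Blue Waters at vastly larger scale; the `plan`, `population`, `adjacency`, and `county` structures are assumptions for illustration.

```python
# A minimal sketch (not the researchers' code) of checking non-partisan
# districting constraints. Assumptions: `plan` maps each census unit to a
# district id, `population` and `county` give each unit's population and
# county, and `adjacency` maps each unit to its neighboring units.
from collections import defaultdict, deque

def district_populations(plan, population):
    totals = defaultdict(int)
    for unit, district in plan.items():
        totals[district] += population[unit]
    return totals

def is_equipopulous(plan, population, tolerance=0.01):
    # Every district's population must be within `tolerance` of the ideal.
    totals = district_populations(plan, population)
    ideal = sum(totals.values()) / len(totals)
    return all(abs(p - ideal) / ideal <= tolerance for p in totals.values())

def is_contiguous(plan, adjacency):
    # Each district must form one connected piece of the adjacency graph.
    members = defaultdict(set)
    for unit, district in plan.items():
        members[district].add(unit)
    for units in members.values():
        start = next(iter(units))
        seen, queue = {start}, deque([start])
        while queue:
            u = queue.popleft()
            for v in adjacency[u]:
                if v in units and v not in seen:
                    seen.add(v)
                    queue.append(v)
        if seen != units:
            return False
    return True

def county_splits(plan, county):
    # Count counties whose units land in more than one district,
    # a proxy for "keeping counties together."
    districts_per_county = defaultdict(set)
    for unit, district in plan.items():
        districts_per_county[county[unit]].add(district)
    return sum(1 for d in districts_per_county.values() if len(d) > 1)
```

Note that nothing in these checks touches partisan data, which is the point Cho makes about computer-drawn maps.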

We take these constraints into consideration and draw, say, 800 million maps that satisfy them while using no partisan information (or perhaps using it only in a limited capacity). Once we have this large set of maps, we can see how the non-partisan constraints shape them. We can then compare the map under dispute with our comparison set to see whether the current map is an outlier on partisan outcomes. By comparing the current map to our large set of maps, we can assess how potential partisan considerations have changed the outcome and, in turn, the fairness of the map.
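The outlier comparison Cho describes can be illustrated with a short sketch. This is not the researchers' method; `seats_won` is a hypothetical helper that scores a plan's partisan outcome under some fixed set of past election results, and the report simply locates the disputed map within the ensemble's distribution.

```python
# A minimal sketch (not the researchers' method) of the outlier comparison:
# compute a partisan summary for each map in the ensemble and see where the
# disputed map falls in that distribution. `seats_won` is a hypothetical
# function returning how many districts a party would carry under fixed
# historical election results.

def percentile_rank(value, ensemble_values):
    # Fraction of ensemble maps with a partisan outcome at or below `value`.
    below = sum(1 for v in ensemble_values if v <= value)
    return below / len(ensemble_values)

def outlier_report(disputed_plan, ensemble_plans, seats_won):
    disputed_seats = seats_won(disputed_plan)
    ensemble_seats = [seats_won(plan) for plan in ensemble_plans]
    return {
        "disputed_seats": disputed_seats,
        "ensemble_min": min(ensemble_seats),
        "ensemble_max": max(ensemble_seats),
        # e.g. 0.999 would mean a more favorable partisan outcome than
        # 99.9 percent of the non-partisan comparison maps
        "percentile": percentile_rank(disputed_seats, ensemble_seats),
    }
```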

Can you give an example of where a tool like this would have been helpful in fighting partisan gerrymandering?

The Supreme Court has been trying for decades to settle on a standard for, or a definition of, political fairness, but has found this task sufficiently difficult that it has come very close to declaring partisan gerrymandering non-justiciable. It has not gone quite that far and has left open the possibility of a workable standard. A main shortcoming is that while the Court wishes to limit partisan gerrymanders, it does not have a way to assess how much partisanship, versus other legal criteria, has factored into the process. Partisan information is allowed in the redistricting process, but its use cannot be excessive. The Court needs a way to separate natural consequences arising from particular population concentrations from partisan attempts to bestow an unnecessary political advantage.

Our computational tool provides a way to measure the level of partisanship exhibited by any particular map, allowing the Court to assess whether an electoral map was created with partisanship as the predominant motive or merely as one relatively small consideration among many other legal criteria.

What are your next steps with the tech and research?

Our next steps are to further refine the algorithm so that it can be used in a wide variety of scenarios. Different states and locales may have different criteria that must be satisfied in their jurisdictions, so that would involve coding in additional criteria relevant to particular states.
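One way to picture those per-state criteria, purely as an assumption about how such a tool might be organized rather than a description of the researchers' software, is a configurable set of constraint checks. This sketch reuses the hypothetical helpers from the earlier example.

```python
# A minimal sketch (an assumption, not the researchers' code) of making the
# constraint set configurable per state, so jurisdiction-specific rules can
# be added without changing the core map-drawing loop. Assumes the earlier
# hypothetical helpers (is_equipopulous, is_contiguous, county_splits) are
# in scope, and that `data` bundles the population, adjacency, and county
# structures used above.

STATE_CRITERIA = {
    # Each entry lists predicates a candidate plan must satisfy.
    "default": [
        lambda plan, data: is_equipopulous(plan, data["population"]),
        lambda plan, data: is_contiguous(plan, data["adjacency"]),
    ],
    # Hypothetical example: a state that also limits county splits.
    "IL": [
        lambda plan, data: is_equipopulous(plan, data["population"]),
        lambda plan, data: is_contiguous(plan, data["adjacency"]),
        lambda plan, data: county_splits(plan, data["county"]) <= 5,
    ],
}

def satisfies_state_criteria(plan, data, state="default"):
    criteria = STATE_CRITERIA.get(state, STATE_CRITERIA["default"])
    return all(check(plan, data) for check in criteria)
```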

You have used supercomputers in the past to do research in political science. How did you start to work with the supercomputer, and what opportunities does supercomputing offer in terms of political science research?

Both my co-author, Yan Liu, and I have been coding for about 30 years. Both of our interests in computation are long-standing. This project is not simply about computing, but is the culmination and confluence of many interests. Yan has conducted research in the past on spatial models, which was enormously helpful for designing the spatial components of our evolutionary algorithm. I have written extensively on redistricting, minority politics, and the Voting Rights Act.

We have both been very aware of how the power of information and computing has manifested its extensive and often surprising reach into many realms of life. Our capacities to compile, organize, analyze, and disseminate information have increased dramatically and have facilitated the creation of many tools that connect citizens and automate human tasks. We can use these very same capacities to improve society. They can shed light on our governance structures, ideally enabling us to improve our democratic society.

