Ballotpedia's Polling Index: Presidential approval rating

Ballotpedia's presidential approval polling average: 43% (May 5, 2017)

Results are updated daily at 9:30 a.m. EST and aggregated from the most recent polls from the sources listed in the methodology section below. Think we're missing something? Email us.

The presidential approval rating indicates public satisfaction with the job performance of the president of the United States. It is the percentage of people polled who approve of, or think favorably of, the president.

Daily average ratings

Daily presidential approval rating average
Date Average approval rating Change
May 5, 2017 43 percent -
May 4, 2017 43 percent -
May 3, 2017 43 percent -
May 2, 2017 43 percent -
May 1, 2017 43 percent ▲ 1
April 28, 2017 42 percent -
April 27, 2017 42 percent -
April 26, 2017 42 percent -
April 25, 2017 42 percent -
April 24, 2017 42 percent ▼ 1
April 21, 2017 43 percent ▲ 1
April 20, 2017 42 percent ▲ 1
April 19, 2017 41 percent -
April 18, 2017 41 percent ▼ 1
April 17, 2017 42 percent ▲ 1
April 14, 2017 41 percent -
April 13, 2017 41 percent -
April 12, 2017 41 percent -
April 11, 2017 41 percent -
April 10, 2017 41 percent -
April 7, 2017 41 percent -
April 6, 2017 41 percent ▲ 1
April 5, 2017 40 percent ▼ 1
April 4, 2017 41 percent ▼ 1[1]
April 3, 2017 42 percent -
March 31, 2017 42 percent -
March 30, 2017 42 percent -
March 29, 2017 42 percent -
March 28, 2017 42 percent ▼ 2[2]
March 27, 2017 44 percent -
March 24, 2017 44 percent -
March 23, 2017 44 percent ▼ 1
March 22, 2017 45 percent -
March 21, 2017 45 percent ▲ 1
March 20, 2017 44 percent -
March 17, 2017 44 percent -
March 16, 2017 44 percent -
March 15, 2017 44 percent ▼ 1
March 14, 2017 45 percent -
March 13, 2017 45 percent -
March 10, 2017 45 percent -
March 9, 2017 45 percent -
March 8, 2017 45 percent -
March 7, 2017 45 percent -
March 6, 2017 45 percent ▲ 1[3]
March 3, 2017 44 percent -
March 2, 2017 44 percent -
March 1, 2017 44 percent -
February 28, 2017 44 percent -
February 27, 2017 44 percent -
February 24, 2017 44 percent -
February 23, 2017 44 percent -
February 22, 2017 44 percent -
February 21, 2017 44 percent ▼ 1
February 20, 2017 45 percent -
February 17, 2017 45 percent -
February 16, 2017 45 percent -
February 15, 2017 45 percent ▲ 1
February 14, 2017 44 percent -
February 13, 2017 44 percent ▼ 1
February 10, 2017 45 percent -
February 9, 2017 45 percent -
February 8, 2017 45 percent ▲ 1
February 7, 2017 44 percent -
February 6, 2017 44 percent ▼ 1
February 3, 2017 45 percent -
February 2, 2017 45 percent ▲ 2
February 1, 2017 43 percent ▼ 1
January 31, 2017 44 percent -
January 30, 2017 44 percent ▼ 1
January 27, 2017 45 percent ▼ 2
January 26, 2017 47 percent ▲ 2
January 25, 2017 45 percent -

About these numbers

The numbers above are averages taken from several polls. But how did those polls arrive at those numbers? The simplest answer is to say that a company or organization contacted a group of adults, asked them some questions, and then reported how they responded. Polling, however, is a science, and once you take a look beneath the hood of a poll, things become much more complicated.

Below we briefly highlight three aspects of public polling that illustrate both the complexity of polling and how polls tend to differ from one another. Understanding these concepts is key to interpreting what polls mean and underscores the value of polling averages.

Contact method

Pollsters use a variety of methods to contact potential survey participants. From the 1930s to the 1980s, pollsters generally did their work through direct contact: going door-to-door, a remarkably expensive and time-consuming method.[4] Nowadays, pollsters rely on telephones and the internet. Neither approach comes without challenges. Fewer Americans today, for example, live in households with landlines than did 20 or even 10 years ago. On the other hand, not every American—particularly in older generations—has a cell phone. To get around this, many pollsters call a combination of landlines and cell phones for a survey. An additional problem is that, with the rise of caller ID, fewer people pick up the phone to participate in surveys—part of a systemic problem in the modern polling industry known as declining response rates. Some pollsters have looked to the internet as a workaround for this issue, but analysts continue to debate the accuracy and dependability of online polls.[5][6]

There are also differences among polling firms in who contacts the participants. Some phone-based surveys use live interviewers, while others use automated interactive voice response systems.[6] Within the polling community, there has been significant debate over the merits of all of these approaches.

Question framing

Though all polling firms, in general, are after the same goal—to find out what the public thinks about a given topic or issue—they don’t always ask their questions the same way. Studies have found that differences in how questions are worded—even subtle ones—can lead to a range of results. In 2003, for example, Pew Research found that when it asked respondents whether they “favor or oppose taking military action in Iraq to end Saddam Hussein’s rule,” 68 percent said they favored military action. But when Pew appended “... even if it meant that U.S. forces might suffer thousands of casualties” to the question, only 43 percent responded in favor.[7]

The number of possible answers that pollsters provide to respondents has also been known to produce different results. With questions about presidential approval and disapproval, for instance, some firms only give respondents the options of saying approve or disapprove. Other firms, however, give respondents more flexibility by allowing them to respond with answers such as “strongly approve” or “somewhat disapprove.” Again, these slight differences have historically led to differing results among polling firms.[8]

The sample

Pollsters can’t realistically contact every American adult throughout the country and ask their opinion on a given issue. Instead, they try to contact a representative sample—usually anywhere between 500 and 1,500 individuals—that accurately reflects the country’s population as a whole. Pollsters, with the help of statisticians, demographers, and data experts, use a variety of techniques to create a representative sample. This typically involves using complicated probability formulas and algorithms to ensure random sampling and to increase the likelihood of contacting an accurate cross-section of the U.S. adult population. Some pollsters also create panels of respondents that they believe reflect the actual population and poll them repeatedly over a span of time; these polls are usually called tracking polls. Oftentimes, pollsters weight their respondents to account for various demographic measurements. For example, a pollster might weight the responses from a specific demographic group more heavily if that group was underrepresented in the random sample relative to the country’s estimated demographic composition. Conversely, responses from an overrepresented group might be weighted less heavily.
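
To illustrate the weighting idea, here is a minimal sketch in Python. The demographic groups, target shares, and poll responses below are hypothetical and are not drawn from any real poll or from Ballotpedia's data; the code simply shows how responses from an underrepresented group can be weighted up so the sample matches a target population mix.

```python
# Minimal sketch of demographic weighting (post-stratification).
# All groups, target shares, and responses below are hypothetical.

# Target population shares (e.g., from census estimates).
population_share = {"18-34": 0.30, "35-54": 0.35, "55+": 0.35}

# Hypothetical raw poll: each respondent has an age group and an answer.
respondents = (
    [("18-34", "approve")] * 80 + [("18-34", "disapprove")] * 70 +    # 150 (underrepresented)
    [("35-54", "approve")] * 160 + [("35-54", "disapprove")] * 190 +  # 350
    [("55+", "approve")] * 230 + [("55+", "disapprove")] * 270        # 500
)

n = len(respondents)
sample_share = {g: sum(1 for grp, _ in respondents if grp == g) / n
                for g in population_share}

# Weight = target share / sample share, so each group counts as much
# as it would in a perfectly representative sample.
weights = {g: population_share[g] / sample_share[g] for g in population_share}

weighted_approve = sum(weights[grp] for grp, ans in respondents if ans == "approve")
weighted_total = sum(weights[grp] for grp, _ in respondents)

print(f"Unweighted approval: {sum(1 for _, a in respondents if a == 'approve') / n:.1%}")
print(f"Weighted approval:   {weighted_approve / weighted_total:.1%}")
```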

Samples are also where margins of error (MoE) come into play. The MoE describes the potential range of variation for a poll’s results in the context of its representative sample and the actual population. For example, if a poll with a margin of error of 3 percentage points showed that 47 percent of respondents approve of candidate X, that means the pollster believes, based on the representative sample in the poll, anywhere between 44 and 50 percent of the actual population approves of candidate X. Generally speaking, a larger sample size means a smaller MoE, while a smaller sample size means a larger MoE. Other factors such as the poll’s design, probability formulas, and weighting methods can also affect MoE.[9][10]
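
A rough back-of-the-envelope calculation shows where a figure like a 3-point margin of error comes from. The sketch below assumes the common simple-random-sampling formula at roughly a 95 percent confidence level; real pollsters also account for survey design and weighting effects, as noted above.

```python
import math

def margin_of_error(sample_size: int, proportion: float = 0.5,
                    z: float = 1.96) -> float:
    """Simple-random-sampling margin of error at ~95% confidence (z = 1.96).

    Uses proportion = 0.5 by default, which gives the widest (most
    conservative) interval. Real polls also apply design-effect and
    weighting adjustments that this sketch ignores.
    """
    return z * math.sqrt(proportion * (1 - proportion) / sample_size)

# A poll of roughly 1,000 respondents gives about a 3-point margin of error.
for n in (500, 1000, 1500):
    print(f"n = {n:>4}: MoE = {margin_of_error(n):.1%}")

# Example from the text: 47 percent approval with about a 3-point MoE
# means the estimated range runs from roughly 44 to 50 percent.
p = 0.47
moe = margin_of_error(1000, p)
print(f"47% +/- {moe:.1%} -> {p - moe:.1%} to {p + moe:.1%}")
```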

For questions on polls and methodology, email: [email protected].

For Ballotpedia's presidential approval, congressional approval, and direction of the country polling results, we take an average of the most recent polls on one or more of these topics conducted by 12 sources. Polls may be included in the average for up to 30 days, though this timeline may be adjusted to account for major news events as we attempt to balance the need for a larger sample of results with the need to remove outdated information. For a full description of our methodology and polling explanations, see: Pliny's Point polling methodology.
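
As an illustration of how an average of recent polls might be computed, here is a minimal sketch in Python. The poll values are invented, and while the 30-day window follows the description above, the exact averaging rules (rounding, outlier handling, event-driven window adjustments) are assumptions rather than Ballotpedia's published formula.

```python
from datetime import date, timedelta

# Hypothetical recent polls: (source, end_date, approval_percent).
# These values are illustrative only.
polls = [
    ("Source A", date(2017, 5, 3), 42),
    ("Source B", date(2017, 5, 1), 45),
    ("Source C", date(2017, 4, 27), 41),
    ("Source D", date(2017, 3, 30), 44),  # older than 30 days; dropped
]

def polling_average(polls, as_of: date, window_days: int = 30) -> float:
    """Average the most recent poll from each source within the window."""
    cutoff = as_of - timedelta(days=window_days)
    latest_by_source = {}
    for source, end_date, approval in sorted(polls, key=lambda p: p[1]):
        if end_date >= cutoff:
            latest_by_source[source] = approval  # later polls overwrite earlier ones
    values = latest_by_source.values()
    return sum(values) / len(values)

print(f"Average as of May 5, 2017: {polling_average(polls, date(2017, 5, 5)):.0f} percent")
```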

Typical poll questions asked either online or by phone include:

  • "Do you approve or disapprove of the way __ is handling his job as President?"[11]
  • "Do you approve or disapprove of the way Barack Obama has handled his job as president?"[12]
  • "How do you think __ will go down in history as a president?"[13]
  • Rasmussen Reports allows respondents more than two options, including Strongly Approve, Somewhat Approve, Somewhat Disapprove, and Strongly Disapprove in their daily Presidential Tracking Poll.[14]


First presidential approval rating - Gallup
President Approval rating from first Gallup poll Poll end date
Barack Obama 68 percent January 23, 2009
George W. Bush 57 percent February 4, 2001
Bill Clinton 58 percent January 26, 1993
George H.W. Bush 51 percent January 26, 1989
Ronald Reagan 51 percent February 2, 1981

[15][16]

Presidential approval ratings can vary significantly throughout a president's term. For example, in the wake of the September 11, 2001, terrorist attacks, President George W. Bush's approval rating, based on Gallup polls, hit an all-time high of 90 percent. It later dropped to 25 percent in October 2008. Overall average approval ratings from Gallup for previous presidents going back to Harry Truman range from 45 percent to 70 percent.[17]


See also

Ballotpedia daily polling averages:

Footnotes