Ensure Voters Get Accurate Information

Voters deserve to know the truth about elections and candidates, but social media has made it possible for bad actors, including cyber terrorists, to quickly spread lies to large numbers of people, such as incorrect information about when and how to vote. State lawmakers can help election officials and law enforcement respond effectively to attempts to mislead voters, and can impose penalties on those who do, to preserve our representative democracy.


PARTNERS

  • People who want to exercise their right to vote
  • Voting rights advocates
  • Good government advocates
  • Bipartisan election officials

OPPOSITION

  • Interest groups that benefit from low voter participation and weakened election integrity
  • Cyber terrorists


In The News

  • Missouri Independent: "States across the country, including Missouri, rush to combat AI threat to elections"
  • Fortune: "How election misinformation thrived online during the 2024 presidential race—and will likely continue in the days to come"

MODEL POLICY

Protect Voters from Deception Act

SECTION 1 (TITLE):

This act shall be known as the Protect Voters from Deception Act.

SECTION 2 (PURPOSE):

This Act protects voters and the integrity of elections by prohibiting the knowing dissemination of false information about the time, place, or manner of an election, deceptive digital content that impersonates official election communications, and threats or acts of violence intended to deter lawful voting.

SECTION 3 (PROVISIONS):

a) Definitions.

i) “Communication” means any advertisement, broadcast, publication, social media post, or other form of mass communication disseminated to the public.

ii) “Deepfake” means materially deceptive audio, visual, or multimedia content that is generated, modified, or created using computers, artificial intelligence, or similar technical means, and that falsely appears to be authentic or truthful. A deepfake includes a depiction of an individual, including a candidate for public office or a public official, appearing to say or do something the individual did not say or do.

iii) “Disseminate” means transmitting content, including digitizations, by any physical or electronic means, including but not limited to social media, email, text message, video-sharing platforms, or publication.

iv) “Materially deceptive” means intentionally manipulated in a manner that would cause a reasonable viewer or listener to believe the depicted content is a truthful representation.

b) Prohibited Conduct.

i) A person shall not knowingly disseminate, or enter into any agreement to create or disseminate, any of the following:

1) A communication or deepfake that the person knew or should have known contains false information regarding the date, time, or place of an election; the manner of conducting an election; or the qualifications for, or restrictions on, voter eligibility.

2) A communication or deepfake that threatens any act of violence, intimidation, or destruction of property in connection with an election. 

3) A communication containing a deepfake disseminated with the intent to injure the reputation of a political candidate or public official, including an election official, or to mislead a voter about who is or is not a political candidate.

ii) Exemption. The prohibition in subsection (b)(3) of this section shall not apply if the communication contains a disclosure stating in a clear and conspicuous manner: “This [image, audio, video, or multimedia] has been manipulated by technical means and depicts speech or conduct that did NOT occur,” and:

1) For printed communication, the disclosure text is in bold, in a font size of at least 12 points or the largest font size otherwise used in the communication, whichever is greater.

2) For television or video communication, the disclosure statement must be clearly readable by the average person for the entire duration of the video.

3) For audio communication, the statement is spoken clearly at the beginning and end of the communication, and every 60 seconds if longer than two minutes, in the same pitch, language, and volume as the rest of the content.

4) The metadata of the communication includes the disclosure statement, the tool or software used to create the deepfake, and the date and time of creation. The disclosure statement shall be embedded in the communication’s metadata in a format that is permanent and resistant to removal by subsequent users.

c) Enforcement and penalties.

i) Administrative enforcement. A person may file a complaint with the [Office of the Secretary of State] for a violation of this Act.

1) The [Office] shall hear such complaints in accordance with [citation for existing complaint and administrative hearing procedures] and may impose a penalty of up to [$2,500] per violation. Each instance of dissemination, advertising, broadcasting, or social media posting shall constitute a separate violation. 

2) Upon receiving a report of a communication or deepfake in violation of this Act, the [Office of the Secretary of State] shall: 

(a) Provide accurate and timely information regarding remedial measures to the reporting person;

(b) Issue or direct the issuance of corrective communications via media outlets and online platforms in affected areas; and

(c) Recommend or implement corrective actions, including but not limited to extending polling hours or adjusting public advisories.

3) The [Office of the Secretary of State] may also refer the matter for criminal prosecution.

ii) Civil action.

1) A civil action for injunctive relief may be brought in a court of competent jurisdiction by any of the following:

(a) The Attorney General or a prosecuting authority in the appropriate jurisdiction; or

(b) A registered voter in the state targeted by the communication or the deepfake; or

(c) A person who was falsely depicted or impersonated in the communication or deepfake; or 

(d) An organization representing the interests of voters in [state] targeted by the communication or the deepfake.

2) Relief may be granted to prevent imminent or ongoing violations of this Act, and require the removal, correction, or labeling of the communication or deepfake content. 

3) The court may impose additional civil penalties and award costs, attorney’s fees, and damages to a prevailing plaintiff in an injunctive relief action.

iii) Criminal enforcement. A prosecuting authority in the appropriate jurisdiction may pursue criminal prosecution for violations of this section, including violations committed in furtherance of another crime. A person who violates this section is guilty of a crime as follows:

1) A first offense shall be a Class 2 misdemeanor, punishable by a fine of up to $2,500.

2) A second offense within three years shall be a Class 1 misdemeanor, punishable by a fine of up to $5,000 and/or imprisonment for up to 12 months.

3) Nothing in this Act shall preclude a prosecuting authority in the appropriate jurisdiction from pursuing criminal prosecution for any other violation of law.

d) SEVERABILITY: The provisions of this Act are severable. If any provision of this Act or its application is held invalid, that invalidity shall not affect other provisions or applications that can be given effect without the invalid provision or application.
