Ghosts in the Classroom

How AI Fraud Is Haunting College Campuses, and Why the Key Word Is "Verify"

Marley Stevens thought her freshman year at the University of North Georgia would be defined by exams and late-night study sessions. Instead, it was defined by an email. Her professor accused her of cheating—of using artificial intelligence to write a paper. The “proof”? Turnitin’s AI-detection score. But Marley hadn’t used ChatGPT at all. She had used Grammarly, a writing tool that her school itself recommended. Within days, she lost her scholarship, her GPA plummeted, and her anxiety soared. “I felt helpless,” she told USA Today. “I couldn’t sleep or focus on anything.”

While Marley fought to clear her name, a darker form of academic deception was unfolding across America’s community colleges—one that didn’t involve students cheating with AI, but AI itself pretending to be students.

Rise of the “Ghost Students”

According to Fortune and the Associated Press, organized crime rings are now using AI bots to impersonate students, enroll in online classes, and siphon off federal financial aid. In California alone, over 223,000 fake enrollments and $11.1 million in unrecoverable aid were uncovered in 2024. These “ghost students” use stolen or fabricated identities to apply for college, get accepted, and vanish once the aid hits their accounts. They even “attend” class—submitting AI-written homework—to appear real long enough to cash out. 

Need a fake identity for an online presence? Think "sockpuppet": a fake online persona created with AI tools to deceive others. A typical sockpuppet account often looks complete because it assembles assets from readily available sources, for example a social-media profile (e.g., Twitter/X or Instagram), a disposable or free email address, a generic online résumé or directory listing, and an image pulled from a stock-photo library or an AI avatar generator. But those surface signals don't prove identity.

Educators and investigators should treat such combinations as red flags rather than proof: look for inconsistencies, verify against primary documents or institutional records, and report suspected identity theft to authorities and resources such as the FTC or your institution's security or financial-aid office.

At one college, 50 fake applications were submitted within two seconds. Another professor thought her class had finally filled to capacity, only to learn that nearly all her “students” were algorithms. The scope is staggering: the U.S. Department of Education estimates $90 million in aid has been stolen, some of it even using the identities of deceased individuals.

Two Sides of the Same AI Coin

These two stories—Marley’s false accusation and the ghost student epidemic—illustrate how AI can both wrongfully convict the innocent and empower the guilty.

Forensic accountants, fraud investigators, and higher-education administrators now face a dual challenge: protecting honest students from false AI-detection flags and identifying fraudsters who weaponize AI to commit large-scale financial crimes against colleges and taxpayers.

What Forensic Accounting Can Teach Us

From a forensic perspective, these schemes leave behind digital fingerprints—patterns of addresses, email domains, and submission timing that betray nonhuman coordination. Data analytics tools can surface and verify clusters of fraudulent activity by flagging multiple FAFSA applications from the same IP address or phone number, course submissions with identical AI-generated language patterns, and login timestamps that occur at impossible intervals. This is modern forensic work: not chasing paper trails, but algorithmic trails.
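The two simplest fingerprints described above—shared contact details and inhumanly fast submission timing—can be sketched in a few lines of Python. The field names, sample records, and thresholds below are illustrative assumptions, not drawn from any real FAFSA dataset:

```python
# Hypothetical sketch: flag clusters of aid applications that share an
# IP address or phone number, and submission bursts arriving faster
# than a human could plausibly type. All data here is made up.
from collections import defaultdict
from datetime import datetime

applications = [
    {"id": "A1", "ip": "10.0.0.5", "phone": "555-0100", "ts": "2024-03-01T09:00:00"},
    {"id": "A2", "ip": "10.0.0.5", "phone": "555-0101", "ts": "2024-03-01T09:00:01"},
    {"id": "A3", "ip": "10.0.0.5", "phone": "555-0102", "ts": "2024-03-01T09:00:02"},
    {"id": "A4", "ip": "10.0.9.9", "phone": "555-0200", "ts": "2024-03-02T14:30:00"},
]

def clusters_by(apps, key, min_size=3):
    """Group applications on a shared attribute; keep suspiciously large clusters."""
    groups = defaultdict(list)
    for a in apps:
        groups[a[key]].append(a["id"])
    return {k: v for k, v in groups.items() if len(v) >= min_size}

def rapid_fire(apps, max_gap_seconds=2):
    """True if any consecutive submissions arrive within max_gap_seconds."""
    times = sorted(datetime.fromisoformat(a["ts"]) for a in apps)
    return any((later - earlier).total_seconds() <= max_gap_seconds
               for earlier, later in zip(times, times[1:]))

print(clusters_by(applications, "ip"))  # {'10.0.0.5': ['A1', 'A2', 'A3']}
print(rapid_fire(applications))         # True
```

Real investigations would run queries like these against institutional enrollment databases and combine many weak signals before escalating, but the logic is the same: coordination that no set of genuine strangers would exhibit.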

Human Cost Behind the Data

Victims like Heather Brady of San Francisco discovered $9,000 in loans taken out in her name for classes she never attended. Brittnee Nelson, a Louisiana small-business owner, spent two years untangling fake student loans that nearly went to collections. Meanwhile, students like Marley Stevens lose mental health, scholarships, and trust in the academic system—all because of AI systems gone awry.

Lessons for Fraud Students: What to Watch For

In your fraud and forensic accounting studies, these cases exemplify:

1. Identity Theft and Synthetic Fraud: identities (real or fabricated) are used to extract benefits from institutions. Verify application details against primary records.
2. Control Weaknesses: open-admission or automated systems are vulnerable where identity verification is weak.
3. Data Analytics for Detection: outlier analysis, Benford's Law, and duplicate-address detection reveal clusters of fraud.
4. Ethical Oversight: human judgment remains essential in AI-assisted decisions.
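Item 3's Benford's Law test can be sketched briefly. Benford's Law predicts that in many naturally occurring financial datasets the leading digit d appears with probability log10(1 + 1/d), so fabricated amounts that cluster on one digit stand out. The dollar amounts below are invented for illustration:

```python
# Hedged sketch of a first-digit frequency check in the spirit of
# Benford's Law. Amounts and the deviation threshold are hypothetical.
import math
from collections import Counter

def first_digit_freqs(amounts):
    """Observed frequency of each leading digit 1-9 in a list of amounts."""
    digits = [int(str(abs(a)).lstrip("0.")[0]) for a in amounts if a]
    counts = Counter(digits)
    total = len(digits)
    return {d: counts.get(d, 0) / total for d in range(1, 10)}

def benford_expected(d):
    """Benford's Law: P(first digit = d) = log10(1 + 1/d)."""
    return math.log10(1 + 1 / d)

def benford_deviation(amounts):
    """Largest gap between observed and Benford-expected digit frequency."""
    obs = first_digit_freqs(amounts)
    return max(abs(obs[d] - benford_expected(d)) for d in range(1, 10))

# Fabricated disbursements clustered just above $9,000 all lead with 9,
# while Benford expects a 9 only about 4.6% of the time.
suspicious = [9000, 9200, 9100, 9000, 9300, 9150, 9050, 9000]
print(round(benford_deviation(suspicious), 2))  # 0.95
```

A real engagement would use a chi-square or similar statistical test on far larger samples rather than a raw maximum gap, but the intuition carries over: fraudsters invent numbers, and invented numbers rarely follow Benford's distribution.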

Every forensic accountant must now be both a data analyst and, to some extent, an ethicist, if only because professional skepticism must now extend to everything, including the detection tools themselves.

Call to Action

The U.S. Department of Education is now requiring government-issued ID verification for first-time aid applicants, but the arms race continues. As future fraud examiners, students should see this as a case study in real-world digital deception—one that merges psychology, cybersecurity, and forensic finance. Whether it's an innocent student accused of cheating or a fake one stealing financial aid, AI has blurred the line between human and machine fraud. And on today's college campuses, sometimes the most dangerous ghosts aren't haunting the dorms—they're haunting the databases.
