When the digital revolution first hit classrooms in the early 2010s—and more schools began issuing laptops or tablets to students for individual use—online learning seemed to promise a faster, easier path to learning. All of Shakespeare’s plays, or an algebra textbook, were suddenly at students’ fingertips; engaging games were poised to make practicing phonics or multiplication tables more fun and more efficient. A generation of digital natives would take to these tools like ducks to water. Learning could happen anytime and anywhere—and therefore, the thinking went, would.
“It allows us to extend the classroom beyond these four walls,” gushed a teacher in a 2011 New York Times article, referring to his school’s rollout of a one-to-one iPad initiative.
In the ensuing years—and especially in 2020, when most schools closed for some period of time due to the Covid-19 pandemic—nearly all of schooling moved online. Surveys estimate that 94 percent of K–12 students now have their own school-issued device. Digital learning platforms, especially those focused on reading and math, were already ubiquitous when Covid shuttered schools. IXL, one of the largest, now claims that “95 of the top 100” school districts use the platform in classrooms.
Digitizing math practice made particular sense. Online games and other kinds of practice problems seemed ideal for building foundational math skills, and early research seemed to confirm their potential to help students achieve. What platforms like IXL, i-Ready, Zearn, DreamBox, and others provided was similar to traditional worksheets, only “smarter”—with digital hints to help students who were stuck, engaging games and images, and algorithms that could customize the student experience and predict what kind of problems an individual learner needed next.
Yet experts and educators now wonder whether these digital platforms are delivering the gains in learning and achievement they seemed to guarantee. After nearly a decade of widespread digital learning, math achievement in the United States hasn’t improved. In fact, in many areas of the country, it has declined.
Millions of students have access to the most popular digital platforms in school and at home each day. Providers proudly share evidence that consistent use of their platform improves academic performance; yet closer inspection of the evidence they tout reveals that very few students use the platforms the way they’re intended.
If digital platforms make it easier and more efficient for students to practice, if more practice is a sure path to higher achievement, and if most districts have made the platforms available to students, then why aren’t more students using them?
Which Students Benefit the Most?
Among global peers, America’s usually middling math performance has been slowly declining. On the Programme for International Student Assessment, or PISA, American 15-year-olds scored 18 points lower in math in 2022 than they did two decades earlier. Meanwhile, on the National Assessment of Educational Progress, the group of students who fall into the lowest achievement category, “below basic,” is growing. In 2019, 19 percent of 4th graders scored below basic; in 2022, that number rose to 25 percent. For 8th graders, the share scoring below basic grew from 31 percent in 2019 to 38 percent in 2022.
Research confirms that skill practice is essential for math proficiency. Yet educators and experts say most students today don’t get enough time to practice, often due to increased classroom attention to developing conceptual understanding and the pressure on teachers to devote equal attention to more than two dozen grade-level standards. Teachers say this pressure often leaves little room to solidify what’s been learned.
“It’s a huge problem,” Sarah Powell, associate professor at the University of Texas at Austin, told Edutopia last year. “We have trapped ourselves with having so many expectations; at each grade level there are these standards, and you are supposed to cover these 30 things and not let anything go,” she said.
XQ Institute senior adviser Laurence Holt said digital math platforms can play a critical role by helping students get more practice on key math skills, even when they’re not in class. These platforms give teachers detailed reports, often broken down by skill, that reveal students’ gaps and feature algorithm-based guidance on where individual students need the most help.
“Parents understand that if you want to get good at a sport or a musical instrument, it takes practice. So when they hear that their student is behind in math, where’s the practice?” Holt said in an interview. “The wonderful news is classrooms now have free access for all your kiddos on this amazing, clever platform.”
Students who practice consistently on digital platforms like Khan Academy and Zearn show impressive increases in math achievement, according to the platforms’ own research. But when Holt took a closer look at the data, he found those success stories were often limited to a tiny number of students—in some cases, about 5 percent.
A large study of 99 school districts published by Khan Academy in 2022 showed “an effect size of 0.26 standard deviations (SD)—equivalent to several months of additional schooling—for students who used the program as recommended,” Holt wrote.
The magic words, he stressed, were “used the program as recommended.” According to Khan Academy, only 4.7 percent of students were getting the recommended dosage of 30 minutes of online math practice per week.
A similar study of students who used Zearn, a game-based math platform for students in kindergarten through 8th grade, found that students who used the platform as intended were most often white or Asian, were more likely to be from high-income neighborhoods, and were less likely to be considered at risk. Findings like these suggest that the students benefiting the most from the availability of digital math practice are the ones who likely need it the least.
Digital platforms would have the potential to help students get more math practice, plus crucial in-the-moment feedback, if more students were inclined to use them regularly, Holt said. “Research has shown that the learning rate for all students is essentially the same, provided you get support, which could just be feedback. All of these automated systems already do it. If you have that, then you’ll make gains. It’s just time on task.” But right now too few students are spending enough time on task to make much of a difference.

Why Do Digital Platforms Go Unused?
One reason student use may be low is that a lot of teachers aren’t crazy about the digital tools their district has adopted. According to a 2023 Education Week survey, half of teachers rated their math tech tools as “good,” while one in three said they’re “mediocre or poor.” A casual social media survey of educators elicited a wide variety of attitudes toward computerized practice—some worried about screentime, others about cheating, and still others said digital platforms are more suited to students who are already motivated—an idea supported by some research. “IXL is quite good for the motivated student to whom you can say, ‘Go do this,’ and they go do it,” one teacher wrote.
Others say the material on digital learning platforms doesn’t match their core in-class curriculum. “Teachers hate it and students are lost,” said one Texas math teacher who requested her name be withheld. Her district adopted both i-Ready and the Texas Mathematics Toolbox, a compilation of resources, strategies, and tools, but the lessons in the two systems often don’t match up, and practice problems aren’t consistent between lessons, the teacher said.
Dylan Kane teaches 7th-grade math in a small mountain district in Colorado, one of the lowest-performing in the state. Kane said he “hasn’t touched” the ALEKS digital platform his school purchased, preferring to design his own practice problems.

ALEKS “operates in a paradigm that’s the opposite way of how I think about practice,” he said. The digital platform can’t mix up problems from multiple units, an evidence-based practice called interleaving that’s been shown to help students learn more deeply by asking them to make connections across concepts.
“Most platforms do a bunch of repetition and then once you master that skill, the student moves on and doesn’t see that skill in the future. I basically do the opposite of that—I design short chunks with feedback. The key skills for 7th grade come up over and over again. They are going to see those problems every week for the rest of the year.”
Some educators question whether practice on digital platforms, even when used in the prescribed amount, leads to better math performance—especially for students far behind. Jonathan Regino, the math supervisor and curriculum designer for a small, blue-collar district in Pennsylvania where nearly half of students live at or below the poverty line, said that when he arrived new to the district in May 2023, most students there were using IXL for math. During the pandemic, the district’s schools had started implementing the platform in classrooms and sending teachers to professional development sessions on how to use it as part of the core curriculum. But the majority of students were still struggling with proficiency, Regino said. Only 31 percent of students were working at grade level.
Internal district data showed that out of all students, kindergarten through high school, only 2nd-grade students were seeing benefits.
“What I saw was the kids realized, especially at the secondary level, that the questions repeat because there wasn’t enough data in the system,” Regino said. “So the kids were screenshotting, taking pictures, and just rapidly going through the questions, and then they would go through it a second time, get 100 percent—they got their points—and move on to the next thing. And the teachers were running like chickens with their heads cut off, trying to help every kid, because they were all on different skills. It was pure chaos, and students weren’t getting anything out of it.”
In reading, Regino said, online assessments built into many platforms can help teachers pinpoint where students are struggling and can direct the digital tools to work on specific skills like fluency or decoding. The right practice for fluency work, for example, looks different from practice targeting missing sounds and sound blends. But math practice programs haven’t reached that diagnostic level.
“All these programs don’t actually intervene at that same level. In math, you need to know if it’s an acquisition issue, a fluency issue, [or] a generalization issue, and each one of those has a very particular type of intervention,” he said. Fluency in math refers to how quickly and accurately the learner performs an operation, while acquisition refers to the stage in which the student is first learning or understanding a concept. Generalization involves applying a concept to a broad range of problems. So, if a student hasn’t yet fully acquired the skill of dividing fractions, for example, the teacher needs to decipher what mistake the child is actually making to know what to do next. Is it that they don’t understand that a fraction is a number? Or what dividing does to a fraction? Or do they need to relearn the rules for dividing fractions? “But [digital programs] assume it’s partially acquisition, partially fluency, so they kind of straddle the issue, and when you straddle it, you don’t actually fix it.”
Similarly, Regino and other educators have complained that some digital tools like i-Ready don’t share students’ individual responses with teachers, just general categories of strengths and weaknesses, making it difficult for teachers to target gaps and intervene accurately.
Using digital platforms for homework presents another set of challenges. Research shows that consistent math homework can improve achievement, and a few minutes of nightly practice on a digital platform could be especially useful for building and reinforcing skills. Yet a movement resisting homework has grown among parents and teachers.
“I’m worried about the complete lack of homework” in math class, said University of Notre Dame cognitive scientist and researcher Nicole McNeil. “I just worry when I go into schools and the kids say, you know, we don’t have any homework. None at all, ever.”
And even when schools do assign digital math practice as homework, it’s not always as straightforward as it seems. Minnesota parent Robin Corry, for example, has two children: a highly motivated daughter who is a good student, and a son who is less motivated and has dyslexia. Their math practice homework looks totally different.
Because digital math practice is based on an algorithm, where correct answers yield harder and harder problems, while wrong answers add more and more of the same problems, Corry’s daughter would finish quickly, while her son often took hours to complete his work. If her son was totally lost, there was a video to watch—but he often didn’t want to watch or pay particularly close attention to the video. Wanting to complete the homework as fast as possible—often to go outside and play—he’d rush through it, guessing at answers—something the algorithm punished by providing even more problems.
“If you take a long time to answer, or if you get them wrong, you get more work,” Corry said. “We’re talking about a 5th-grade boy, right? The homework wasn’t about how many minutes you worked on it; it was about what percentage you got right. So if you were good at IXL, it could have been a 20-minute assignment. If you were bad at IXL, it could be three hours.”
After too many nights of tears and frustration, Corry stepped in and found a workaround: She and her son would take each problem IXL provided and do it together, with pencil and paper. If her son didn’t understand what the problem was asking, she’d explain it to him. Then he would come up with an answer and type it into the platform.
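The dynamic Corry describes, in which wrong answers generate still more work, can be made concrete with a toy simulation. This is a sketch of the general mechanic only: the function and its parameters (`penalty`, `initial_queue`) are illustrative assumptions, not IXL’s actual algorithm.

```python
import random

# Toy sketch of adaptive practice: correct answers work through the
# queue; each wrong answer adds more problems at the same level, so a
# guessing or struggling student's session balloons.

def session_length(p_correct, initial_queue=10, penalty=2,
                   max_problems=500, seed=0):
    """Return how many problems a simulated student is served.

    p_correct: probability the student answers a problem correctly.
    penalty:   extra problems queued after each wrong answer.
    """
    rng = random.Random(seed)        # fixed seed for a reproducible run
    remaining = initial_queue
    served = 0
    while remaining > 0 and served < max_problems:
        served += 1
        remaining -= 1
        if rng.random() >= p_correct:   # wrong answer...
            remaining += penalty        # ...queues more of the same problems
    return served

print(session_length(1.0))   # a student who gets everything right: 10 problems
print(session_length(0.3))   # a frequently guessing student: many times more
```

With these hypothetical parameters, each problem removes one item from the queue but adds two back whenever the answer is wrong, so any accuracy below 50 percent makes the expected workload grow rather than shrink—a 20-minute assignment for one child, hours for another.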

Increasing the Benefits
Representatives of some digital platforms admit that boosting student engagement is an ongoing challenge.
Kristen Huff, head of measurement for Curriculum Associates, the company that developed and administers i-Ready, called it a “critical problem.” i-Ready serves 14 million students nationwide, and Huff said the firm recommends its math product Personalized Instruction, which provides individualized lessons and activities online, as an in-class companion to core instruction. An internal study showed that students in kindergarten through 8th grade who used this i-Ready component for 30 to 49 minutes a week for 18 weeks out of a 36-week school year achieved significant growth in math. In the early grades, about 60 percent of students met the recommended amount of practice time; by middle school, that number dropped to around 40 percent.
To help schools get the most out of these programs, the company provides a range of support, including ongoing training and coaching for implementation, a detailed “success guide,” and troubleshooting. Close to 40 percent of the company’s staff work solely in schools and classrooms, Huff said, with the aim of maintaining a “constant feedback loop” with educators.
“When we see schools who are not meeting our recommended implementation, we have a few themes that we work with them on,” she said. “Building community and culture, just really helping every person in the system—district, principal, classroom teacher, instructional coach, parent, student—understand why, when you’re sitting in front of the computer working on a lesson, it’s actually a part of the classroom theory of action.”
Trainers work with schools and staff on how to incorporate the digital lessons and practice into what teachers are already doing, Huff said. But not all students will use it as intended. Students who are the most behind—those in “Tier 3,” in classroom parlance—often aren’t using digital platforms because they are the ones most likely to be pulled for small-group or individual intervention with a teacher. These students often aren’t proficient enough to be “let loose” on digital practice.

Some platforms are looking at how to combine the best of all worlds—the benefits of paper and pencil, digital tools, and teacher input. ASSISTments, a free, independent digital math tool created by a researcher and his wife, allows teachers to select, create, or assemble digital problem sets for students. Teachers can make individual assignments for the whole class, a subgroup of students, or individuals—but all assignments are chosen or created by educators, not an algorithm. In many ways, said co-executive director Britt Neuhaus, their biggest competition is the traditional worksheet. “Who has time to grade worksheets every day?” Neuhaus said. “It is creating more value for the teacher.”
“If today the teacher is teaching ratio, for example, they make a decision: make the assignment in ASSISTments that’s about ratio or something focused on prerequisite knowledge to ensure students can learn ratio right,” said Mingyu Feng, research director of learning and technology at WestEd, which performed an independent study of the platform.
The ASSISTments platform still offers the core benefits of technology—for example, students get immediate feedback on whether answers are right or wrong (unless they’re working on open-response problems)—and they can’t move on until they’ve solved a problem correctly.
The platform provides hints on how to tackle a problem if needed, and it produces several kinds of reports for educators, including one showing how students performed on each problem. This allows teachers to tailor future instruction and practice based on how students actually performed on the homework.
For a fee, teachers can also receive professional development and coaching on how best to implement the platform as part of core instruction. But the teacher is still in the driver’s seat when it comes to what material the students are working on digitally. The point of the platform is to closely link teachers’ classroom decisions for their students to data that can support their next instructional move, Feng said. “It’s mentioned a lot in math education that teachers should do data-driven instruction. But without support from computers, it’s pretty hard.”
Feng and her team performed a three-year, randomized controlled trial of the ASSISTments platform as a homework intervention for middle schoolers across two states. Students using the digital homework designed by the teacher performed significantly better in math than students in the control group—not just that year, but in the following years as well.
The nonprofit behind the platform says that ASSISTments currently serves about 100,000 students in all 50 states—a small share compared to the massive reach of the popular algorithm-based platforms. Until two years ago, when the developers began a more formal outreach program to districts, ASSISTments had relied solely on teacher word of mouth, Neuhaus said.
Often, research-based tools don’t have marketing teams behind them, Feng said, and developers struggle to get the word out even when a product has demonstrated its effectiveness in helping students learn.
“The user interface of the platform wasn’t as polished by designers as some long-standing, commercially available tools,” Feng said. “That’s a common issue with products made by researchers. The product is much improved now and is integrated with commonly used learning management systems like Google Classroom or Canvas.”
Kane, the 7th-grade teacher from Colorado, has found his own digital solution, similar to ASSISTments—DeltaMath. Nearly all of Kane’s students are behind in math, and he feels that creating his own digital problem sets allows him to customize them to students’ needs. But he also relies on a much older form of technology.
“Digital can often be hard and unreliable—students will lose their charger, or whatever. They’ll do it for five or seven minutes, get a dopamine boost from watching the problems disappear from the screen. With some exceptions, most 7th-grade math skills require a pencil and paper.”
Holly Korbey is an education journalist, author, and editor of The Bell Ringer Substack. She lives in Nashville, Tennessee, with her family.
Suggested citation format:
Korbey, H. (2025). “The Practice Problem: Research shows that students benefit from digital math practice platforms. So why don’t more students use them?” Education Next, 25(4), 6 August 2025.