
Media Guide on LearningRx Settlement with the Federal Trade Commission

(Updated May 18, 2016)
For additional information regarding this summary, please contact LearningRx Vice President of Research & Development Tanya Mitchell at 719.264.8808 or [email protected]

I.           Overview
In May 2016, LearningRx Franchise Corporation (“LearningRx”) settled an advertising investigation by the Federal Trade Commission (FTC), as part of an ongoing review by the FTC of marketing practices within the exciting, innovative, and highly promising new field of cognitive (“brain”) training. The FTC inquiry was the first and only time in the 15-year history of LearningRx that its business practices had been questioned by any federal, state, or local regulatory agency.

LearningRx Client Outcomes and Research Results

LearningRx vigorously disputed the FTC’s allegations that claims for its pioneering methods of personalized, intensive, one-on-one brain training lacked adequate scientific support. LearningRx has completed Randomized Controlled Trials (RCTs) on cognitive skills and IQ (see pages 35 and 37 of the Client Outcomes and Research Results booklet), as well as several quasi-experimental controlled studies and numerous observational studies using pre- and post-training standardized assessments from thousands of clients. This large body of clinical data demonstrates that LearningRx brain training programs increase clients’ cognitive skills, IQ scores, reading skills, and standardized reading and math test scores, regardless of any previous diagnosis.

Nevertheless, although its studies show that its advertising claims satisfy the FTC’s traditional substantiation standard of “competent and reliable scientific evidence,” LearningRx made the pragmatic decision to settle its differences with the FTC without any admission of liability or wrongdoing. After weighing the time and energy required to continue defending its claims, LearningRx chose instead to focus on conducting new research and studies, including several randomized controlled trials that are underway with major universities and independent researchers. LearningRx is highly confident that its future advertising will meet all legal standards of truthfulness, as it has in the past.

II.           Some Important Facts About the FTC Matter:

  • LearningRx hired world-renowned expert Dr. Howard Wainer to examine its research and data. He concluded that LearningRx does have competent and reliable scientific evidence to back its claims regarding cognitive skills, IQ, reading, and the training of different populations, including clients with ADHD, autism, TBI, and dyslexia, as well as seniors. Dr. Wainer is a distinguished research scientist, American statistician, and author, most recently, of “Truth or Truthiness: Distinguishing Fact from Fiction by Learning to Think like a Data Scientist” (Cambridge University Press, 2016).
  • The FTC, a non-scientific agency, has made the scientific judgment, which LearningRx believes is wrong, that many brain training claims must be held to the same standard of proof applied to the testing of pharmaceutical drugs, namely randomized controlled trials, or RCTs. Since 2015, it has been applying this standard in settlements not only with LearningRx but also with a number of other companies in the brain training industry.
  • At the beginning of 2016, the LearningRx team met with the FTC Commissioners. Based on its research and studies, LearningRx successfully persuaded the FTC to drop a number of claims from its complaint, including claims regarding improvements in IQ and in the cognitive functioning of dyslexic clients.
  • In no field connected to cognitive training, including education and psychology, do research guidelines treat RCTs as the only permissible experimental method. Nevertheless, the FTC has unilaterally decided that advertising claims for cognitive training services, which pose no health or safety risk, must meet the same standard of proof as drug claims. To LearningRx’s great consternation, the FTC also retroactively applied this new, rigid standard to claims LearningRx was making as far back as 2010, when there was no reason to believe that brain training claims could only be supported by an RCT.

    Following a 2015 settlement with a vision training company, over 75 scientists posted a response on the FTC’s website supporting the company’s research and opposing the FTC’s decision to create this new standard without consulting a broad array of experts in the industry. See https://www.ftc.gov/policy/public-comments/initiative-625

III.           Excerpts from LearningRx’s Submission to the FTC Inquiry:

A. The LearningRx Program

The proprietary curriculum includes a 60-hour foundational training program, called ThinkRx, which targets seven primary cognitive skills and multiple sub-skills through repeated engagement in game-like mental tasks delivered one-on-one by a cognitive trainer. The tasks emphasize visual or auditory processes that require attention and reasoning while incorporating the components of intensity, sequencing, loading, and feedback.

The ThinkRx curriculum is most often used in combination with an additional 60 hours of an intensive sound-to-code reading intervention, called ReadRx, or a 60-hour intensive math intervention, called MathRx. The ReadRx intervention focuses on auditory processing, basic code, and complex coding skills necessary to improve reading rate, accuracy, fluency, comprehension, spelling, and writing. MathRx develops comprehension, numerical fluency, and higher-level thinking skills, the core underlying cognitive skills required to learn mathematical concepts, solve problems, and perform mathematical calculations. The interventions are delivered over the course of 12 to 24 weeks. Students are trained to mastery on each procedure, and a detailed progression through the program is maintained in workbooks to ensure consistency in implementation across students.

B. LearningRx’s Commitment to Research and Advertising Substantiation

LearningRx has been committed to scientific research on cognitive function and training methods since its inception. Over time, it has invested hundreds of thousands of dollars in numerous clinical studies, both quasi-experimental observational studies and RCTs, to evaluate and document the cognitive improvements of individuals who have gone through its program.

Since becoming aware that the FTC considers RCTs mandatory for cognitive claims, LearningRx (even though it disagrees with a rigid RCT standard) has responded diligently and in good faith by initiating a series of additional RCTs, including one just completed and submitted for publication, two in progress, and others planned for 2016.

C. Competent and Reliable Scientific Evidence for LearningRx’s Efficacy Claims

LearningRx’s core advertising claims report improvements in various learning skills based on the robust data LearningRx has collected over many years; namely, the pre- and post-training performance of its students on well-accepted test instruments, including, in particular, the Woodcock-Johnson III (“WJ III”) test, the gold standard of cognitive skills testing. The use of the WJ III test to measure progress from short-duration educational interventions is standard practice in educational research.

The data LearningRx has collected from over 17,000 students since 2010 show statistically significant gains in all tested areas of learning skills, including logic & reasoning, working memory, processing speed, auditory processing, visual processing, and long-term memory. The data show a statistically significant improvement in IQ of 12-21 points, with an average increase of 14 points. Students taking the ReadRx program showed a statistically significant increase in their reading skills, as shown by the WJ III achievement tests, with gains of 2.8 years in reading comprehension, 2 years in reading fluency, 3.5 years in word attack skills, and 6.3 years in sound awareness. Students taking the MathRx program showed statistically significant increases in math skills, with gains of 3.4 years in math fluency, 2.2 years on applied problems, and 3.5 years on quantitative concepts.

Within this large dataset, LearningRx has tracked the performance of individuals who were previously diagnosed with ADHD, autism, dyslexia, TBI, age-related memory decline, and other conditions affecting their ability to learn.[1] LearningRx does not claim that its program will treat any of the underlying health conditions. Rather, LearningRx reported the statistically significant learning skills improvements for its students in these categories, improvements that have been seen in both quasi-experimental observational and controlled studies, including a recently completed RCT that included ADHD subjects.

LearningRx training has substantially improved reading skills of its students, including for persons with dyslexia, averaging an age-equivalent increase of 3.64 years in as little as six months. This claim is verified by LearningRx’s data.

The improvements in learning skills, as shown by the WJ III test, are not short-term gains. A substantial portion of the sample was also tested one year after concluding their LearningRx training. The one-year results show that the improvements in cognitive skills were largely retained for all relevant skills.

In support of LearningRx’s core claims, there are also three controlled trials that confirm the results shown in LearningRx’s own pre-post data. The first study, led by Dr. Oliver Hill, is entitled “The Efficacy of the LearningRx Cognitive Training Program: Modality and Transfer Effects.” In the first phase of this two-phase study, the LearningRx group showed statistically significant gains in working memory, long-term memory, and matrix reasoning. In Phase 2, both LearningRx experimental groups outperformed the control group on matrix reasoning. Dr. Hill’s research was focused entirely on math, math self-efficacy, and the math-related skills of reasoning and memory. The study was published in the peer-reviewed Journal of Experimental Education in October 2015.

A second controlled trial, led by Dr. Dick Carpenter at the University of Colorado Colorado Springs, entitled “Training the Brain to Learn: Beyond Vision Therapy,” has been published in the peer-reviewed Journal of Vision Development and Rehabilitation. The focus of the study was on all seven primary cognitive skills. The treatment group achieved statistically significant improvements in all cognitive skills evaluated by the WJ III test — six of which were significantly higher than the control group, including associative memory, visual processing, logic & reasoning, processing speed, word attack, and auditory processing.

The third study, an RCT entitled “LearningRx Cognitive Training Effects in Children Ages 8-14: A Randomized Controlled Trial,” led by Dr. Dick Carpenter and neuroscientist Dr. Christina Ledbetter from Louisiana State University, was submitted for publication in November 2015. The treatment group showed statistically greater gains in seven of the eight cognitive measures than did the control group.  The mean IQ gain for the LearningRx group was 21 points (considerably higher, even, than the 14-point gain seen in the pre-post data analysis) compared to a mean loss of 4 points for the control group.

There are two additional RCTs in progress. The first is led by Dr. Jason Downer at the University of Virginia’s highly esteemed Curry School of Education. The focus of the study is on the effects of ReadRx for at-risk elementary school students. The study launched in November 2015 and is expected to finish in June 2016. The second RCT in progress is led by Dr. Dick Carpenter from the University of Colorado Colorado Springs and neuroscientist Dr. Christina Ledbetter from Louisiana State University to compare the one-on-one ThinkRx program with a hybrid delivery model of the ThinkRx program. The study began in September 2015.

A third RCT is also underway to add to the research evidence on cognitive treatment for TBI; the study launched earlier this year at The Brain and Body Health Institute in Lakeway, Texas. Neurologist Dr. Robert Van Boven will lead the randomized controlled trial to examine the effects of LearningRx on cognition for soldiers and retired NFL players recovering from TBI. Dr. Van Boven is an expert in TBI research and is also the lead investigator for a Department of Defense TBI study at Fort Hood.

Concerning the academic, income, and athletics claims, abundant scientific research corroborates the common sense notion of a correlation between cognitive ability and achievement in school, careers, and sports, and the corollary that improvements in cognitive functioning should enhance one’s ability to achieve in those critical life areas. This link contains the abstracts of over 90 studies confirming the existence of such a correlation.[2] This body of research aligns with the reported results of better grades and test scores achieved by numerous LearningRx students after completing their cognitive training. It underscores the essential truth of LearningRx’s claims that increases in cognitive ability can and often do lead to greater academic, financial, and even athletic success in life.[3]

Concerning the comparison of LearningRx’s one-on-one cognitive training to tutoring, LearningRx respectfully submits that the comparison was apt and not overbroad. In making the comparison, LearningRx cited a LearningRx Training Results Report that discusses a specific Chicago Public School system study of over 56,000 students enrolled in 30 different tutoring programs. Although LearningRx continues to believe its comparisons were entirely accurate, to further emphasize the specific nature of the comparison, LearningRx added a disclaimer to the statement as follows: “Comparison based on 2005 Chicago Public School system study of over 56,000 students after one year of tutoring in 30 different tutoring programs.” The disclaimer then provides a link to the full study.

Also noteworthy for this comparison is research revealing small changes in achievement for students completing a tutoring program, in contrast to supplemental learning programs like ReadRx and MathRx. Specifically, post-tutoring reading achievement results ranged from a gain of 4.43 years to a loss of 3.44 years, and math achievement results ranged from a gain of 2.91 years to a loss of 6.1 years. In contrast, ReadRx students achieved reading gains ranging from 2.6 years to 6.1 years with a mean gain of 3.6 years, and MathRx students achieved math gains ranging from 2.4 years to 3.9 years with a mean gain of 3.4 years. (See “Achievement Outcomes for LearningRx Students: Math and Reading Achievement Before and After Cognitive Training.”)

D. Expert Validation of LearningRx’s Substantiation

LearningRx’s substantiation has been reviewed by Dr. Howard Wainer, Distinguished Research Scientist with the National Board of Medical Examiners, retired Professor of Statistics at the Wharton School of the University of Pennsylvania, former Principal Research Scientist with the Educational Testing Service, and author of numerous articles and books on statistics. See Dr. Wainer’s analysis and findings, along with his curriculum vitae. He concludes that:

(1) Research other than RCTs, including powerful observational studies, can and commonly does provide “evidence to estimate causal effect”;

(2) LearningRx’s observational and randomized controlled studies “all support the credibility of the claims made that those trained using the LearningRx approach markedly increased scores on IQ tests and, more particularly, on sub-scores of those tests”;

(3) Randomization was not feasible for the small subgroups, such as TBI, autism, and dyslexia, but “with effects as consistently large as was being observed, the decision to go forward, as was done, was both ethically and scientifically preferable. Within the constraints that exist, therefore, there is convincing evidence supporting the claims of efficacy made.”

In sum, LearningRx’s efficacy claims are well-supported by the pre-post data from the many thousands of students who have completed the program, RCTs, the expert opinion of an eminent statistician, and the scientific literature with regard to the claims of correlation between cognitive ability and academic, financial, and athletic achievement.

E. Acceptable Research Designs Other than RCTs in LearningRx’s Field

The FTC’s insistence on RCTs as the only valid, indeed the only lawful, form of proof of the efficacy of cognitive training programs is entirely foreign to LearningRx’s research experience and background, which has taught it that many different types of research are recognized as appropriate and acceptable in its field. LearningRx views itself principally as a supplemental educational services provider, in that its mission is to sharpen cognitive skills so that individuals can think, learn, and study better.

In this regard, in 2008, the American Educational Research Association (AERA) created a definition of Scientifically Based Research at the request of Congress. The AERA defined acceptable methods of scientific research to include observational or experimental/quasi-experimental designs, randomized designs or statistical matching, and basic, applied, and evaluation research. As a supplemental educational services provider, LearningRx’s research should fall under these flexible standards and not be pigeonholed into a rigid RCT requirement prescribed by a single, non-scientific federal agency.

Even if the LearningRx model of cognitive training were to be viewed as a treatment rather than an educational intervention, the evidence LearningRx has compiled would meet the research standards of the American Psychological Association (APA). The APA endorses multiple types of research, including clinical observation, qualitative research, case studies, single-case experiments, ethnographic research, naturalistic interventions, and process-outcome studies in addition to RCTs and “their logical equivalents.”

The scientific evidence standards accepted by the American Speech-Language-Hearing Association (ASHA) are also relevant to LearningRx’s work, including, in particular, its work with soldiers with TBI, which it bills as cognitive therapy using principles and techniques of speech, vision, and audiology therapy. Taking into account the frequent practical challenges of conducting RCTs, ASHA also looks to other research designs, including case reports, controlled single-subject designs, quasi-experimental designs, and true experimental designs, in evaluating the effectiveness of communication intervention methods.

Optometry, the field of LearningRx founder Dr. Ken Gibson, also allows for multiple types of research, as evidenced by the American Academy of Optometry Clinical Research Award guidelines, which solicit applications for observational studies, case-controlled studies, and cross-sectional studies in addition to RCTs. Dr. Leonard Press, the Editor-in-Chief of the Journal of Vision Development and Rehabilitation, which publishes articles on visual cognition, concurs, stating that he is “very concerned about any standard imposed requiring that claims of efficacy of treatment can only be supported by prospective, randomized, double-blind clinical trials. To my knowledge, there isn’t any clinical journal that constrains its peer review process to accepting manuscripts limited to RCTs, or claims of efficacy linked only to outcome measures of RCTs.”

In accepting for publication a 2015 research article by Dr. Gibson et al. entitled “Training the Brain to Learn: Beyond Vision Therapy,” which was not an RCT, Dr. Press noted that the Journal’s reviewers found the methodology to be “sound, supportive of its claims, and with appropriate documentation” and disclaimers. Id.

As an expert in visual processing himself, and a pioneer in the development of cognitive training techniques based on principles of visual (and auditory) processing, Dr. Gibson has always followed the guidelines of his profession, which permit and encourage clinical research using a variety of methods, not only RCTs.

F. Lack of Basis and FTC Guidance for a Rigid RCT Requirement

Coming from a field that recognizes the validity and value of varied types of research, and never having been questioned by any regulatory agency in over 30 years in the cognitive training field (including in an earlier licensed training program), Dr. Gibson never understood, until being advised by the FTC, that the Commission took the position that an RCT for cognitive claims was mandatory and that the failure to have one was illegal and could create a multi-million-dollar liability.

The substantiation standard for evaluating the truthfulness of all other health claims is “competent and reliable scientific evidence.” Quoting from the FTC’s “Dietary Supplements: An Advertising Guide for Industry,” which defines the standard:

Randomized controlled trials are not required…

“competent and reliable scientific evidence” is a “flexible” standard, and “[t]here is no fixed formula for the number or type of studies required.” Although “well-controlled human clinical studies are the most reliable form of evidence[,]” they are not necessary… one should look to the “totality of the evidence”… studies on the precise formula used in the advertised product are not required. [4]

In the ruling for Bayer, the court held that the “competent and reliable scientific evidence” standard in its order, the very same standard that appears in the FTC Advertising Guide and that specifically states an RCT is “not required” for dietary supplement claims, failed to adequately apprise Bayer that the only acceptable substantiation for its PCH claims was an RCT. This was particularly true of the RCT standard that the FTC expert espoused, which would require a: (1) randomized; (2) placebo-controlled; (3) double-blind; (4) human clinical trial; (5) done in the target population; (6) with the specific product at issue; (7) using appropriate statistical methods; and (8) designed with the desired outcome as the primary endpoint. The FTC, the court found, “presented no evidence of any law, regulation, or guidance that would have provided notice to Bayer that [such] RCTs are required for the PCH claims at issue.” [5] (Emphasis supplied.)

Due process and basic fairness demand adequate notice of a legal duty before one can be sanctioned for violating it. While the context in Bayer was a contempt proceeding in which heightened notice requirements apply, Dr. Gibson and LearningRx were still entitled to a measure of fair notice, well before the Civil Investigative Demand (“CID”) arrived in the mail in August 2014  and discussions with the staff began, that as far as the FTC was concerned, RCTs were the only legal way to substantiate efficacy claims for cognitive training. Unfortunately, they were never given fair notice — never even had the opportunity to obtain it — for there is “no evidence of any law, regulation, or guidance that would have provided notice” to them that RCTs are required for the claims at issue.[6]

LearningRx is confident, as explained above, that the “totality of the evidence,” consisting of multiple quasi-experimental observational studies of thousands of students, two published controlled studies, one newly completed RCT, two RCTs in progress, and others slated to begin, passes the test of “competent and reliable scientific evidence” and substantiates the core truthfulness of its efficacy claims.

G. Testimonials

Students and parents of students have described the benefits of LearningRx cognitive training, and expressed their appreciation for what it has meant in their lives, in thousands of genuine, heartfelt letters and testimonials. A sampling of hundreds of them may be found at http://birdeye.com/learningrx-607782240.

Further, on a scale of 1 to 10, with 10 being the highest rating, 71% of parents rated the LearningRx programs a 10 and 24% rated them an 8 or 9, with the mean parent rating for the programs being 9.5 out of 10. Benefits identified by parents include increased confidence, better grades, improved standardized test scores, and better performance in sports.

[2] For example, Freberg et al. (2008) reported that full-scale IQ is a significant predictor of academic achievement among 6- to 13-year-olds, and Hogan et al. (2010) reported a significant relationship between verbal IQ and academic achievement among high schoolers. Further, statistically significant relationships between IQ and employment (Lynn & Zietsman, 2013) and between IQ and economic growth (Meisenberg & Lynn, 2013) have been reported in the peer-reviewed literature. These studies built on initial research by Ree et al. (1994), who reported that general intelligence is the best predictor of job performance. Supporting the common knowledge that cognitive ability is related to sports performance, Cona et al. (2015) reported that cognitive factors are key contributors to marathon performance, and Faubert (2013) reported that mental processing skills are critical elements of elite sports performance.

[3] Athletic performance can be enhanced in any number of ways. Better performance in football, for example, can result from improvement in memory (remembering the count, assignment, or play), attention (not jumping offside, attending to the play call in the huddle), and visual processing (anticipating the flight of the ball, the movement of the defender that needs to be blocked, the speed and position of the runner that needs to be tackled, etc.).

[4] If RCTs are not required for claims for dietary supplements, which can have adverse effects, this heightened standard should not apply to claims for a cognitive training program, which presents no risk of harm to its participants.

[5] Bayer follows two other decisions which also held that competent and reliable scientific evidence does not require RCTs. See FTC v. Garden of Life Inc., 845 F. Supp. 2d 1328, 1334-35 (S.D. Fla. 2012) (when a consent decree speaks only of “competent and reliable scientific evidence,” the government cannot “require [the] court to read additional requirements into the Consent Decree.”), aff’d in part and vacated in part, 516 F. App’x 852 (11th Cir. 2013); Basic Research, LLC v. FTC, No. 2:09-cv-0779 at 26-27 (D. Utah Nov. 25, 2014) (by demanding “gold standard” clinical trials, which “exceed[ ] the requirements of the [consent decree],” the government failed the “expectation of reasonableness.”).

[6] It would have been immensely helpful to LearningRx and other practitioners in the relatively young and evolving field of cognitive training if the Commission had provided earlier substantiation guidance specifically applicable to them, other than through sporadic, contradictory consent orders, or had at least sent it in a warning letter. If dietary supplement marketers are permitted a pre-enforcement warning about “extraordinary” weight loss claims and the need to have an RCT, well after the Commission had manifestly made RCTs the standard for that product category, then LearningRx surely, in all fairness, should have received the same courtesy before the Commission had given any clear notice at all of an RCT requirement for cognitive claims.

 
