There's a new app to spot sports concussions. Does it work?
That's BrainEye's claim. What does the evidence say?
It's all in the eyes
I sat down with BrainEye's app on Monday. I held my phone in two hands, elbows resting on my desk for stability, and then followed a small bobble with my eyes as it wandered across my screen.
This is a test of 'smooth-pursuit eye movement': the ability to keep a moving target centred on my retinas. Athletes with concussion tend to struggle with this test, their tracking slow, the target often missed.
'Almost half the neurons in the brain are involved in processing vision,' says Fielding, who is also a research fellow at Monash University's Department of Neuroscience.
'Concussions hit the brainstem and frontal lobes particularly hard. When you smack your brain around, it's disrupting networks. The brainstem is especially vulnerable. That's where all the signals are generated for an eye movement.'
A concussion causes short-term disruption to neural patterns and, potentially, longer-term harm to brain tissue.
My BrainEye test informed me that I did not have a concussion, which is good news. Can it spot the red flags of concussion in athletes? That's where it gets murkier.
Does it work?
The company has received a lot of positive press for a validation study it ran on AFL footballers, in which the tech spotted 100 per cent of footballers with concussions, and had a false-positive rate of about 15 per cent.
We should note that other tests for sport-related concussion are not 100 per cent robust either. A 2023 systematic review found the sensitivity of three common tools was between 50 and 88 per cent; all tools had false-positive rates of about 15 per cent.
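To make those figures concrete: sensitivity is the share of genuinely concussed athletes a tool flags, and the false-positive rate is the share of healthy athletes it wrongly flags. Here is a minimal sketch with invented counts, not data from BrainEye or the 2023 review:

```python
# Illustrative only: hypothetical counts, not results from any of the studies discussed.
true_positives = 9    # concussed athletes the tool flagged
false_negatives = 2   # concussed athletes the tool missed
false_positives = 15  # healthy athletes wrongly flagged
true_negatives = 85   # healthy athletes correctly cleared

# Sensitivity: of the athletes who really had a concussion, what share did the tool catch?
sensitivity = true_positives / (true_positives + false_negatives)

# False-positive rate: of the healthy athletes, what share were wrongly flagged?
false_positive_rate = false_positives / (false_positives + true_negatives)

print(f"sensitivity: {sensitivity:.0%}")                  # 82%
print(f"false-positive rate: {false_positive_rate:.0%}")  # 15%
```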
But the study itself, funded by BrainEye and published in Sports Medicine - Open in March, has several issues. First, the sample size: 11 concussed AFL footballers in total (plus baseline data from 384 non-concussed players).
'Such a low sample size means this must be viewed with caution,' says the Australian Institute of Sport's David Hughes, who is also lead author of the AIS Concussion and Brain Health Position Statement.
It's also worth noting here that three of the four Monash University researchers who conducted the Sports Medicine - Open study now work for BrainEye.
The study was done on players the researchers knew already had a confirmed concussion. It was unblinded. And the paper does not report confidence intervals, the standard measures that tell us the level of uncertainty in the data. One statistician who read the paper did a quick back-of-the-envelope calculation; the estimate that came out was much lower than BrainEye's headline figure.
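To see why that matters, here is a rough sketch of my own (an illustration, not the statistician's actual calculation, and it assumes SciPy is available) of the exact binomial confidence interval implied by 11 out of 11 concussions detected:

```python
# Rough illustration of the uncertainty behind "11 out of 11 detected" (not from the paper).
from scipy.stats import beta

detected, total = 11, 11
alpha = 0.05  # 95 per cent confidence interval

# Clopper-Pearson (exact) interval for a binomial proportion.
lower = beta.ppf(alpha / 2, detected, total - detected + 1) if detected > 0 else 0.0
upper = beta.ppf(1 - alpha / 2, detected + 1, total - detected) if detected < total else 1.0

print(f"observed sensitivity: {detected / total:.0%}")  # 100%
print(f"95% CI: {lower:.0%} to {upper:.0%}")            # roughly 72% to 100%
```

In other words, a perfect hit rate in 11 players is still statistically consistent with a true sensitivity in the low 70s.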
'I would hope to see more data collected before these sorts of claims can be substantiated,' Swinburne University's leading concussion researcher, adjunct professor Alan Pearce, tells me.
The method of detecting potential changes to the brain in the study – red flags for concussion – is also intriguing.
BrainEye took two measures, smooth eye tracking and 'pupillary light reflex', the quick response of the pupil to light, and combined them into an overall BrainEye score.
It then generated a cut-off value for each measure, and for the overall BrainEye score. If an athlete's score was below the cut-off, they were assessed as concussed.
'It isn't clear how this is calculated as it isn't a direct average of these two outputs,' says associate professor Frances Corrigan, a concussion researcher at the University of Adelaide.
Indeed, of the concussed footballers in the study, one had smooth eye tracking above the cut-off, and four had pupil reflexes over the cut-off.
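The cut-off idea itself is straightforward; the sketch below is purely hypothetical, with an invented threshold and a naive average that, as Corrigan notes, is not how BrainEye actually combines the two measures:

```python
# Hypothetical illustration of a cut-off classifier. The threshold and the simple average
# are invented for illustration; the paper does not specify BrainEye's actual formula.
def flag_possible_concussion(eye_tracking_score: float,
                             pupil_reflex_score: float,
                             cutoff: float = 75.0) -> bool:
    """Flag an athlete whose combined score falls below the cut-off."""
    combined = (eye_tracking_score + pupil_reflex_score) / 2  # naive average, illustration only
    return combined < cutoff

# An athlete can sit above the cut-off on one measure yet still be flagged overall.
print(flag_possible_concussion(eye_tracking_score=60.0, pupil_reflex_score=80.0))  # True
```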
BrainEye tells me it no longer uses pupil reflexes in its app, and instead uses two measures of smooth eye tracking built on more than 150,000 completed tests. (I asked for additional clinical validation data, but it wasn't provided.)
Then there's the usability question. A smartphone concussion test seems like a no-brainer. But when the researchers tried to enrol AFL clubs in their study, five declined because 'they found the kit and set-up too difficult and/or time-consuming to incorporate into their existing post-concussion assessment protocol'.
Of the 10 clubs that did agree to take part, only three integrated BrainEye into their concussion screening. Even then, several concussions were missed because staff 'forgot' to use the device.
Why did clubs find it so tricky to use, given it's just a smartphone?
Well, when it was tested in 2022, BrainEye wasn't quite just a smartphone. The tester version came with a custom stand and chin-rest, an LED light bar and an infrared (IR) camera. Athletes had to sit on a height-adjustable chair to use it correctly.
Even with the stand, about 10 per cent of players did not manage to get the tests to return usable data, often because they were moving their heads too much.
The current version of BrainEye's app works without a stabilising stand, and it now captures different data from the eyes. So is it reasonable to still rely on test data gathered with the stabilised version?
Addressing this concern, the company provided an unpublished study titled 'Clinical validation of the BrainEye Smartphone Application'.
The study tested BrainEye's unstabilised app against two medical-grade devices: the Tobii Pro Glasses 3 (RRP $13,000 plus) and the NeurOptics NPI pupillometer. It found the three devices produced highly similar results. 'Our conclusions are accurate and valid,' says Fielding.
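'Highly similar results' is a question of measurement agreement. One common way to check agreement between two devices, though not necessarily the method this unpublished study used, is to compare paired readings for correlation and for any systematic offset:

```python
# Illustrative agreement check between paired readings from two devices.
# The numbers are made up; they are not data from the BrainEye validation study.
import numpy as np

phone_app = np.array([82.0, 91.0, 76.0, 88.0, 95.0, 70.0])  # hypothetical smartphone scores
reference = np.array([80.0, 93.0, 74.0, 90.0, 96.0, 73.0])  # hypothetical lab-device scores

correlation = np.corrcoef(phone_app, reference)[0, 1]  # do the readings move together?
mean_difference = np.mean(phone_app - reference)       # systematic bias between devices

print(f"correlation: {correlation:.2f}")
print(f"mean difference: {mean_difference:+.1f} points")
```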
David Hughes, of the Australian Institute of Sport, is more sceptical. BrainEye 'cannot be recommended as a reliable tool for diagnosing concussion', he says. 'Further studies are needed with improved research methodology, and we also need for these studies to be done within the community sport environment.'
We should be careful, I think, about damning an Australian innovation for not having done every study it needs; BrainEye remains under development. It is not yet regulated as a diagnostic device.
But if the tech is not yet ready for prime time, what alternatives exist?
There are non-smartphone tools that already exist for non-medicos to spot concussion – the CRT6 (Concussion Recognition Tool 6) asks fairly simple questions, such as whether the athlete has blurred vision or neck pain, or feels irritable.
We could also put in place things to minimise the risk – such as banning heading in soccer training – and invest properly in training players and coaches to spot concussion at amateur level. And we can try harder to change a sporting culture that still seems to think blunt-force trauma to the head is acceptable.
Swinburne's Alan Pearce says: 'Everybody thinks that 'tech' will save the day, but it's understanding the seriousness of the injury and cultural change towards concussion. It's not just a 'head knock'.'