
Another Year of Confidence in Emmersion Language Tests

by Emmersion

As we near the end of 2021, and as we conclude another year of helping companies with the digital transformation that has been so undeniably impacted by the global pandemic, we thought that it may be helpful to look back on one area that we know is important: data integrity and score confidence. 

Last year (2020), when the pandemic was young and its disruptions were very fresh, I wrote a blog post highlighting the then-current state of Emmersion’s test security features. (You can find it here.) In this post, I’ll restrict myself to features that have been released or refreshed since that post was published. 

The need for remote testing flexibility remains high

Before (and even early in) the pandemic, as we all scrambled to convert long-held processes that depended on shared spaces and to create safety through distance, we each started guessing about which changes would stick. We were grateful that even clients who had previously used on-site proctoring of our assessments were able to pivot quickly and make full use of our always-remote-capable tests. News of variants, surges, and rekindled hesitations about what to expect going forward has only lengthened the forecast that some of these adaptations will be part of a new normal. 


In all of the industries we serve, we have seen growing comfort with, and perhaps even a preference for, keeping applicant screening and language skill credentialing flexible. We may see some drift back to close-contact practices, or we may not. Either way, we’re ready and our tests are secure. We have not seen any patterned changes in fraud frequency or concentration since the move to predominantly remote testing. However, we will continue to innovate to maintain security, and with it the confidence of our clients. 

Increased depth and breadth of our adaptive item banks

As I highlighted with the release of the first fully adaptive, automated English speaking assessment, we don’t consider adaptivity to be a feature. We consider it to be part of our identity and an area where we separate ourselves from many other solutions. Fully adaptive assessments offer a key barrier to fraud. One test taker may be able to describe the test experience in general terms to someone who hasn’t taken it, but providing specific assistance is far less possible than with a fixed-form test.

In order to maintain and further optimize our uniquely adaptive assessment, we’ve doubled our adaptive item bank since its release. Each assessment includes several unscored items that are being calibrated. Only after a rigorous psychometric evaluation are items graduated to our production item bank and scored. We’ve grown our item bank overall, and we’ve also focused growth on specific difficulty bands that have higher exposure rates than others. With each assessment given, our assessment gets smarter. That feels good to say and even better to observe. 
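To make concrete why an adaptive test is hard to coach someone through, here is a minimal sketch of the general idea behind adaptive item selection. The function name, the simple step-size ability update, and the difficulty values are all illustrative assumptions for this post, not Emmersion’s actual algorithm (production adaptive tests typically use IRT-based estimation):

```python
def run_adaptive_test(item_bank, answer_fn, num_items=10, step=0.5):
    """Illustrative adaptive loop: administer the unexposed item whose
    difficulty is closest to the current ability estimate, then nudge
    the estimate up or down based on the response."""
    ability = 0.0                 # start at the middle of the scale
    remaining = list(item_bank)   # item difficulties not yet shown
    administered = []
    for _ in range(min(num_items, len(remaining))):
        # Choose the item best matched to the current estimate.
        item = min(remaining, key=lambda d: abs(d - ability))
        remaining.remove(item)
        administered.append(item)
        correct = answer_fn(item)
        # Simple up/down update; shrink steps as evidence accumulates.
        ability += step if correct else -step
        step = max(step * 0.8, 0.1)
    return ability, administered

# Simulate a test taker with (hypothetical) true ability 1.2 who
# succeeds on any item whose difficulty is at or below that level.
bank = [-2.0, -1.5, -1.0, -0.5, 0.0, 0.5, 1.0, 1.5, 2.0]
est, items = run_adaptive_test(bank, lambda d: d <= 1.2)
```

Because the item sequence depends on each response, two test takers rarely see the same path through the bank, which is what makes sharing specific answers so much less useful than with a fixed form.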


Screen capture to prevent inappropriate use of online assistance features

There are very few of us whose work lives haven’t been greatly disrupted by the events of the pandemic. One way that work is, has been, and will likely continue to be different is that we have less and less control over when and where work gets done. Browsers and online platforms increasingly let people stream content with or without audio turned on. 

You have likely experienced this. Much more content online is now delivered with accompanying subtitles. Online meeting tools like Zoom and Google Hangouts provide live transcription that helps when you need to mute not only your microphone but also your audio playback. 

Recent developments in live transcription of streamed content have pulled our interests in two directions. On the one hand, our solution and approach rely on highly accurate speech recognition. The investments and advancements in speech recognition open exciting opportunities for us to continue to innovate and make our solutions that measure speech performance more dynamic and accurate. However, live transcription of audio playback could undermine our speaking test’s use of elicited imitation. 

We’ve gotten ahead of this potential threat by adding screen capture during the assessment experience. These screen captures can be shared with stakeholders and support efforts to identify individuals who have inflated their true ability through the use of online assistance features. 

Consolidated Screen for Identity Check and Integrity Pledge 

One of the features highlighted in last year’s fraud prevention blog post was the inclusion of two interactive screens where test takers are informed of and have the opportunity to commit to the expected terms and conditions of the assessment. These interactions heighten elements of ego and honor that have been shown to provide fraud preventative effects. This feature has been widely adopted and likely has contributed to some of the stability we’ve seen in fraud instances. 

Previously, we had separated the interactions for the identity check and the integrity pledge across two separate screens. As we added the screen outlining and setting up screen capture, we realized that we could consolidate the identity check and integrity pledge interactions into a single interface. This allowed us to keep the setup phase of each test lean. 

A Word about Proctoring

Not surprisingly, proctoring comes up in most discussions of test security and score confidence, frequently enough that I feel not addressing it here would be a disservice to the thinking that has been done on the topic. I remain persuaded by the research I referenced last year on the surprising ineffectiveness of proctoring at reducing cheating. A proctor’s role in communicating expectations around taking an assessment can be helpful, but strictly in terms of ‘watched test takers cheat less,’ I think many people who know more than I do about test security would say, ‘You’d think that, but the data doesn’t confirm it.’

While the screen capture work referenced above is a step toward creating more direct visibility into a remote test taker’s experience and actions during the test, we don’t offer integrated proctoring beyond that. We do have clients that have integrated with proctoring tools, and we’ve investigated options and alternatives for similar solutions. We’d love to hear more from current and prospective clients about the unique value they’d derive from proctoring solutions. 

What to look forward to in 2022

We are constantly anticipating, researching, discovering, and then developing new advancements in our assessments. Some areas related to score confidence that we look forward to working on in 2022 include continued growth of our item banks, deeper analysis of open-response content, and development toward adaptive tests in our non-English language products. 

We’ll continue to explore security features around identity confirmation and test fraud prevention. And we will certainly continue to listen to the concerns, perspectives, interests, and ideas others have in this important area. So if you have ideas, we’re all ears.
