Lowering Licensure Standards Puts Kentuckians’ Eye Health At Risk

Now you have me curious. Could you please give me an example of a biased question that has been found on our NBEO exam?

Not off the top o’ my head, no. We do fairly well in that area. I don’t remember crises regarding specific items on NBEO. The items are occasionally revised due to comments from candidates or councils.

But it’s not that difficult to write a biased item. People think of race and gender; those matter. But it doesn’t have to be those.

Bias could be as simple as a dx. Some terminology varies by region and college. Since licensure testing is about public protection, we don’t want a test of vocabulary, dialect, etc. Some tx varies by location (rural v. urban, west v. east). That would be a bad reason to fail someone unless they’re truly wrong.

As our population and profession change, the opportunities for bias likely increase.
 
I was "doing the work" last night and came across bias mitigation in the form of accommodations: a candidate who has trouble reading is assigned someone to read the test questions aloud, because decoding the question isn't what the test is meant to measure. Similarly, some people are allowed extra time to take the test because of a disability.

It is also problematic when a certain group scores more poorly on a test than another group and that automatically gets chalked up to bias rather than to that group's lack of merit. A standardized test is supposed to discriminate against those who don't know the material.

Things like fairness and bias aren't easy to define and I can see where a group like the NBEO might mistake lack of merit for unfairness. -Charlie
 
I like you, Dr. Ohlson, and I appreciate your contribution to our profession, but what you just said sounds like a lot of "who shot John."

I don’t really know what that phrase means. I’ll Google later. :)

History: I was initially in the camp of: How can an optometry test have bias? Show me bias in optics. I started out at the baseline of zero.

Turns out it’s possible. I simply didn’t know. Not a big problem for us, but possible… yeah, it is possible.

The good news is that NBEO is not considered rife with bias. Thank you, God. I don’t need more issues.

Bias can adversely affect those who are quite competent. No one wants that.

Every professional college and licensure test deals with ADA and accommodations as required. This is not unique to optometry and NBEO. One follows the law.
 
So it shouldn't be difficult to give an example of how bias can exist in testing optics, right? If it's "not rife," that means that while bias wasn't all over the exam, there WAS some, so let's have examples of those.
 
It would be illuminating to a lot of us if you could have someone dig up maybe two examples where there was an unfair, or biased, question in an optics test question. I'm curious as to how that can be possible. I guess it would be possible if someone INTENTIONALLY worded it to be such, otherwise, ehh?
 
A Judge Judy thing - put nicely, it means I am not so sure about that... our opinions will differ. :cool:

Psychometricians, the nerds who study assessment, have differing opinions. And standards change. Sometimes you pick from options. There are different methods of setting standards; some are better suited to certain situations than others. Sometimes looking at two or three methods helps.
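For anyone curious what "setting a standard" looks like mechanically, here is a toy sketch of one common method, a modified-Angoff calculation. The numbers and function names are purely illustrative, not NBEO's actual procedure: each judge estimates, per item, the probability that a minimally competent candidate answers correctly, and the cut score is the average of the judges' summed estimates.

```python
# Toy modified-Angoff standard setting (illustrative only).
# Each judge rates, per item, the probability that a minimally
# competent candidate answers correctly. The recommended cut score
# is the mean of the judges' summed item estimates.
def angoff_cut_score(ratings):
    """ratings: one list of per-item probabilities per judge."""
    judge_sums = [sum(judge) for judge in ratings]
    return sum(judge_sums) / len(judge_sums)

# Three hypothetical judges rating a four-item test:
judges = [
    [0.70, 0.60, 0.90, 0.50],
    [0.80, 0.50, 0.85, 0.55],
    [0.75, 0.65, 0.80, 0.60],
]
print(round(angoff_cut_score(judges), 2))  # → 2.73 (items correct out of 4)
```

Comparing two or three such methods, as described above, just means running the same panel data through different recipes (Angoff, Bookmark, contrasting groups) and seeing whether the cut scores converge.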
 
So I ran across this just this morning. My son's high school bio final was a pig lab, the 'bonus' question on the quiz was "which little piggy ate roast beef?" (ie, referring to the nursery rhyme.)

It was a funny little throw-in, I'm sure his teacher has been putting it on the test for the last 40 years or however long he's been at the job [he ain't young]

... but if you think about it for more than a few seconds, it was profoundly unfair to anyone who wasn't brought up where that nursery rhyme was a thing (i.e., people who aren't from the US). Unfair because it was an extra-credit question outside the scope of some people's experience that had nothing to do with the bio lab.

I don't know how many points the question was worth, hopefully almost none [or maybe he'll just give everyone credit for it.]

But the point is that's an extreme example of a cultural bias in a test.

I think in STEM you see bias come up in stuff like word problems in physics, where the scenario may not be culturally universal and is easier to mentally model for people familiar with the topic (i.e., if you gave me a physics problem about baseball I'd have a much easier time visualizing it vs. curling. That has everything to do with being a boy raised in the US).
 
I was raised in Texas, but the physics of curling would be easy. That is the beauty of physics, it follows universal laws. Friction at the interface.
 
That all makes sense, but is there any chance that the NBEO finds out, for example, that people from Mississippi scored more poorly on a certain question than students from other states and automatically jumps to the conclusion that people from Mississippi were subject to bias, rather than considering the possibility that people from Mississippi just aren't up to snuff?

Same thing with racial groups. A particular racial group scoring more poorly on a given question doesn't, or shouldn't, automatically equate to the bias of a question. -Charlie
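For what it's worth, psychometricians have a standard tool for exactly this question: differential item functioning (DIF) analysis, which compares groups only after matching candidates on overall ability, so a raw score gap alone is never treated as proof of item bias. Here is a toy Mantel-Haenszel sketch; all counts and group labels are made up for illustration, not real NBEO data.

```python
# Toy Mantel-Haenszel DIF check (illustrative only).
# Candidates are stratified by total test score; within each stratum we
# compare one item's pass/fail counts for a reference and a focal group.
# A pooled odds ratio near 1 suggests the overall score gap reflects
# ability, not something unfair about the item itself.
def mh_odds_ratio(strata):
    """strata: list of [(ref_pass, ref_fail), (foc_pass, foc_fail)] tables."""
    num = den = 0.0
    for (a, b), (c, d) in strata:
        n = a + b + c + d
        num += a * d / n
        den += b * c / n
    return num / den

# Hypothetical counts across three ability strata:
tables = [
    [(30, 10), (28, 12)],  # low total scores
    [(50, 5), (47, 8)],    # middle total scores
    [(40, 2), (39, 3)],    # high total scores
]
print(round(mh_odds_ratio(tables), 2))  # → 1.46
```

A value that stays close to 1 after matching on ability is the statistical version of "the group just scored lower"; a value far from 1 within ability strata is what flags an item for human review.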
 
When I can I will. Still seeing pts, 18 charts behind… it’s not a current fiasco. NBEO is considered not bad in this area. But, it is possible. I didn’t see it at first either. Not intuitive.

There are differing circumstances and purposes in standardized testing. Assessing young people with the Iowa… ah, Adam is adding… Test of Basic Skills (has a new name now) is fine and good. But if it utilized language, grammar, and situations unfamiliar to me, my scores would have suffered.

The PTO on the M revolves at 540 rpm, but a gelding gets loose and becomes entangled… you can lose a city feller. One could do the opposite as well.

I’m hoping this doesn’t go down a rabbit hole that isn’t really a current dilemma. The bias doesn’t lie in trigonometry or differential equations, but in the verbiage.
 
So in high school… my calculus teacher would randomly put up a question on the side board for extra credit. The first person to answer it correctly would get the credit.

She had two AP calculus classes.

Well the complaints started rolling in…

“You put the question up at the start of home room. A kid in your home room class saw it and answered it before I even had a chance to look at it!”

So she put it up at an exact time.

“you put it up at 3 pm…but I am on the 2:55pm bus that’s not fair”

So she made a different question for each class.

“you gave the other class an easy one and gave us a much harder one…that’s not fair!”

Before the internet and Chromebooks… these were real complaints… valid, actually.

Here is an example…

“Complete the next two numbers of this sequence…61…52…63…94…”

Any guesses?

How would we make it fair to the “first correct answer” when people will see this at different times?
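For the curious, one possible reading of that sequence (my guess, not necessarily the teacher's intended answer) is perfect squares with their digits reversed: 16, 25, 36, 49 become 61, 52, 63, 94. A quick script to check:

```python
# One possible reading: each term is a perfect square, digits reversed.
# 4^2=16 -> 61, 5^2=25 -> 52, 6^2=36 -> 63, 7^2=49 -> 94, ...
def reversed_squares(n):
    """First n terms of the reversed-squares sequence, starting at 4^2."""
    return [int(str(k * k)[::-1]) for k in range(4, 4 + n)]

print(reversed_squares(6))  # → [61, 52, 63, 94, 46, 18]
```

Under that reading, the next two numbers would be 46 and 18, which also illustrates the thread's point: the question rewards spotting a specific trick, not calculus.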
 
If that was an actual test credit question, the teacher should be reprimanded for doing that to kids your sons age. Might be ok for nursery school.
 
Basically, yes. It’s not difficult to create a biased item.

The little piggy that went to market didn’t go shopping…
 
I was thinking about this. I think you’re biased. :)

Your hypothetical assumes that NBEO has the resources to run studies for the heck of it. Nope. And that NBEO would jump to conclusions. That's unlikely; there's no supporting evidence. The internal and external psychometricians don't work like that. Nor do the other staff, the ED, or the BoD.

The staff are working on the actual exam. It can be stressful with long days.

On your second concept, you support the importance of proper psychometric concepts. Good! One needs to evaluate carefully.

It’s a great opportunity for a segue. Our colleagues often jump to conclusions. But I don’t mean NBEO.

Raising fees concerned increased costs, necessary spending on IT and test development, etc. The nonprofit's audits are clean. 990s are online (although they paint snapshots). A cmte of state board members reviews the organization each year. The cmtes, councils, standard-setting cmtes, visitors, consultants, and mtgs with ASCO, AOSA, ARBO, etc. place many eyes (pun intended) on NBEO. Conclusion: There's a money bin. Let me know where it's located.

The pass rates allow another example of jumping to conclusions where the evidence points in other directions.
 
So what? It’s extra credit. It’s like a trivia question. You know it or you don’t. Why shouldn’t being from the U.S. and familiar with its culture be an advantage at an American school?
 
It is a biology class - not US history. Including a question that has nothing to do with the subject matter gives people who grew up in the US an advantage they should not have.

This wouldn't matter if it wasn't for real test points, but in a world where grades are curved and students compete against one another, it should be obvious why this isn't a great idea.
 
Hey, as long as the internal and external psychometricians aren't motivated by DEI/post-modern ideologies, I'm on board. Which brings up the question: Are the internal and/or external psychometricians motivated by DEI/post-modern ideologies? Can you offer assurance that they aren't? -Charlie
 
I've seen this story before. About 7-8 years ago there was a crisis of confidence in the Canadian boards: many provinces started accepting NBEO and talking about pulling funding, all of which put significant pressure on their finances. At the time NBEO was also significantly less expensive for candidates.

Ultimately the Canadian boards got the message, got their act together, and ... boy have the tables turned!

So I suspect this is all part of a pressure campaign to force some changes at NBEO.
 
I can’t speak for the politics of any of them. I would not bring up politics with staff or consulting organizations. Psychometricians tend to be motivated by research and math.
 
Sorry for pressing, but could/would the consulting organizations introduce politics in the form of DEI concepts without your knowing? -Charlie
 
DEI is not part of the job description of the NBEO psychometrician or the professional group that rechecks the work. I don’t feel at all pressed. I wouldn’t obsess about this from an ideological standpoint, though.

Test items with problematic bias should be revised or deleted. One wants to assess competency for initial licensure in optometry, nothing else.

The optometry boards, task forces, and cmtes I’ve served on have almost never discussed political ideologies or strongly partisan viewpoints. People get upset or clam up. It’s better to stick to the task.
 
I researched the phrase. I don’t think my answer represented rambling, but I scribble things down at work. Makes sense to me! :)

If I knew a biased test question from NBEO, I’d not post it. Disclosing items isn’t cool. Signed NDAs. So, no.

USMLE has been criticized over race and gender bias. I’ll leave that to experts. Experience has taught me to shut up when better trained, more experienced people working in a profession full-time offer their opinions.

Reading indicates that excellent item development reduces biases. That sounds prudent. The cmte and council members work like demons. Observers report being impressed. This jibes with human nature. It’s generally harder to criticize organizations and people when you actually observe, interact, and understand. In the absence of that, imagination and gossip can take over.
 
As I said yesterday, ask Dr. Nelson; he is a master psychometrician.
 
That’s the beauty of science. Hell, there’s an entire subset of sci-fi movies where the “universal language” (as in the actual universe) is math and/or physics BECAUSE they’re constant.

One of my MCAT test questions back in 1994ish as a college junior was about a bullet traveling along a curved frictionless surface. I wasn’t a ballistics expert and I’d bet there were kids in the room that had never even seen a bullet, but that didn’t negate the question because the PHYSICS was unchanged.
 
Come on, Adam. If that's the case, you could find bias in many testing situations, making it almost impossible to write an unbiased test. That kind of thinking is ridiculous.

Physics is physics. It is what it is.
 
There is no possibility that Mississippians are not "up to snuff".