Eric Sedletzky
Time to post this again to illustrate how worthless AOW is.
Quote: "I think people have an issue with this definition of mastery. I'm not one to get caught up in semantics, but I would rather see 'competency' in place of 'mastery.' To me, mastery means having great knowledge and skill. I don't expect any OWD students to have that level of skill. They should, however, be competent, meaning having the necessary ability to perform a skill successfully."

When using a term like mastery, you have to look at the context. The way the word is used by PADI is consistent with the way the word is used in education circles. You can see it in the explanation of mastery learning by Benjamin Bloom. The idea of mastery learning is that the student works until reaching a specific standard of performance and is then done. That contrasts with standard education, in which the student works for a specified period of time and is then measured and scored. The definition used by PADI is consistent with the definition used in mastery learning.
Quote: "I think people have an issue with this definition of mastery. I'm not one to get caught up in semantics, but I would rather see 'competency' in place of 'mastery.' To me, mastery means having great knowledge and skill. I don't expect any OWD students to have that level of skill. They should, however, be competent, meaning having the necessary ability to perform a skill successfully."

I am not a native English speaker, so I'm not the right person to discuss semantics, but the way I read that description, the term "mastery" as applied to a skill is filtered by certification level. If I am evaluating the hovering skill of an OWD candidate versus an AOWD candidate, I will not apply the same scale.
Quote: "I am not a native English speaker, so I'm not the right person to discuss semantics, but the way I read that description, the term 'mastery' as applied to a skill is filtered by certification level. If I am evaluating the hovering skill of an OWD candidate versus an AOWD candidate, I will not apply the same scale."

You are correct that this is the way training agencies use the term mastery. For them, it is relative to student level. That's okay. I would argue, however, that it would perhaps be better to use the term "competency." No one uses the term "mastery" the way that PADI does, as far as I know (or maybe other training agencies do; I'm not familiar with them all). For example, in medical education the terms "competency" and "core competencies" are used. I'm not sure why PADI uses mastery. It seems like an odd word choice to me. I suppose it's not all that important in the end, so long as students demonstrate the skills to the level they need to. But I do worry that the word choices used in standards, and especially in marketing materials, have a peculiar and not always positive cumulative effect.
Quote: "...the way I read that description, the term 'mastery' as applied to a skill is filtered by certification level."

The problem is that this is not quantified, so it allows a range of what's acceptable according to the instructor, not any standard. Agencies often hide behind that ambiguity, throwing the instructor under the bus if it goes to court.
Quote: "No one uses the term 'mastery' the way that PADI does, as far as I know (or maybe other training agencies do; I'm not familiar with them all)."

As I wrote above, the term as defined by PADI is common in educational theory.
Quote: "The problem is that this is not quantified, so it allows a range of what's acceptable according to the instructor, not any standard. Agencies often hide behind that ambiguity, throwing the instructor under the bus if it goes to court."

It would take a chapter in a book to explain it all. I will try here briefly. The system I am about to describe is used in all major performance evaluations, including essay grading on AP exams and many other such assessments. I have taught this, and I have supervised assessments done this way.
Quote: "The problem is that this is not quantified, so it allows a range of what's acceptable according to the instructor, not any standard. Agencies often hide behind that ambiguity, throwing the instructor under the bus if it goes to court."

Calling out objective standards, like a 3-foot allowed depth window and staying within 15 degrees of trim without sculling, would go a LOOOOONG way. I believe NASE and RAID are the only WRSTC members who mandate this.
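Purely as an illustration of what a quantified standard like that would look like in practice, here is a minimal sketch (Python) of scoring a hover against logged samples. The function name, the (depth, trim) sample format, and the default thresholds are assumptions made up for this example, not any agency's actual standard, and "no sculling" cannot be detected from depth and trim data alone.

def meets_hover_standard(samples, depth_window_ft=3.0, max_trim_deg=15.0):
    # samples: list of (depth_ft, trim_deg) pairs logged over the hover,
    # where trim_deg is degrees of pitch away from horizontal.
    if not samples:
        return False
    depths = [depth for depth, _ in samples]
    # Total depth excursion must fit inside the allowed window (e.g., 3 feet).
    depth_ok = (max(depths) - min(depths)) <= depth_window_ft
    # Trim must stay within the allowed angle (e.g., +/- 15 degrees) at every sample.
    trim_ok = all(abs(trim) <= max_trim_deg for _, trim in samples)
    return depth_ok and trim_ok

# Example: 2.4 ft of drift, at most 10 degrees of trim -> passes.
print(meets_hover_standard([(20.0, 3.0), (21.5, -5.0), (22.4, 10.0)]))  # True
# Example: 4 ft of drift -> fails the 3-foot window.
print(meets_hover_standard([(20.0, 2.0), (24.0, 2.0)]))                 # False

The point of the sketch is simply that a criterion stated this way is checkable: two evaluators looking at the same data would reach the same pass/fail result.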
Quote: "Calling out objective standards, like a 3-foot allowed depth window and staying within 15 degrees of trim without sculling, would go a LOOOOONG way. I believe NASE and RAID are the only WRSTC members who mandate this."

PADI uses that kind of language to describe performance in hovering for decompression stops in their tech program. It is not believed to be necessary at the benchmark level for an open water diver.
Quote: "As I said, it used to be my job to teach this stuff to high school teachers. We who did this were often amazed at the seeming inability of those teachers to understand what is really a simple concept. We finally realized that they could not understand it because they did not want to understand it. While we were explaining it, they were not listening. They were devoting their thoughts to what they would say to show that we were wrong. I suspect something similar happens in scuba, at least on ScubaBoard."

Okay, here we are in agreement. If I recall correctly, both you and @The Chairman have been accused of lying regarding teaching entire open water courses neutrally buoyant and trimmed. I also have been accused of lying. And that is incredibly frustrating. I have asked people to just try. Nope, they've been doing it for decades, they scored high on their IE, there is no room for improvement in teaching methods.
Quote: "PADI uses that kind of language to describe performance in hovering for decompression stops in their tech program. It is not believed to be necessary at the benchmark level for an open water diver."

RAID does it.
There are very few performance areas where that kind of objective language is possible. In fact, it is often counterproductive. In presentations, I used to give examples of very precise and objective language for assessment that sounded good but clearly led to completely erroneous scorings.