View Full Version : What did I get wrong?
06-06-2011, 06:08 PM
Hey! I'm wondering if you can find out which questions you got wrong on the exam. I thought I did at least a 90 ... yea right! I got a 77. Sure would like to find out what I think I know that I don't!
06-06-2011, 08:25 PM
There's no record kept of which answers you got right or wrong so it's impossible for them to give you that info. You passed. That's the important part.
Welcome to the forums btw. There's a wealth of knowledge here to be shared so feel free to join in. On a side note you need to fill in your real name in the forum info so we know who we're talking to.
Congratulations on the test!
06-06-2011, 09:01 PM
I agree with Imageskj. I believe that there should be some kind of feedback on what questions you missed on the CPP exam - and I've sent the Certification Commission a note on that issue through a commission member. Think of it this way (and I'm not picking on anyone because of their scores) - if someone passes the test with a 71, that means they don't know 29% of the information that they should know - and there's no easy way to find out exactly what it is that they don't know but thought they knew. If someone fails the test, it may be enough to tell the applicant to go back, study, and take it again. But for those who passed and have no requirement or need to retake it, no feedback is given, and few, if any, are going to review the entire book to see what they MIGHT have gotten wrong - so they never learn what they missed.
Informing the new Certificant of what they got wrong would be a great educational tool - but that would also take additional work by those who run the certification program - and they are all volunteers - and you can only push your volunteers so far, since they all have studios of their own to run. Hopefully the PPCC might be able to develop some kind of automated electronic means of providing this feedback in the future, but there's no telling how much programming time such a system would take, the resulting cost of that service, or its effect upon the overall cost of the Certification program.
06-06-2011, 09:16 PM
I agree with you Rick but keep in mind that the certification commission certifies photographers. They don't educate them and it's separate for the sake of validity. It would be very simple to program the grading system to give results based on subject areas of the test. An example would be if it said you missed 10% in the film category and 5% in the lighting category. They couldn't give specific answers because that would be leaking the test questions making it possible for people to know the test questions beforehand.
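The per-category report described above is simple to sketch in code. This is a hypothetical illustration only - the category names, data, and function are placeholders, not anything from the actual CPP grading system:

```python
# Hypothetical sketch: per-subject score breakdown from graded answers.
# Categories and data are illustrative, not actual CPP exam content.
from collections import defaultdict

def category_report(answers):
    """answers: list of (category, is_correct) pairs for one candidate.
    Returns percent correct per category, without exposing any question."""
    totals = defaultdict(lambda: [0, 0])  # category -> [correct, attempted]
    for category, is_correct in answers:
        totals[category][1] += 1
        if is_correct:
            totals[category][0] += 1
    return {cat: round(100 * c / n) for cat, (c, n) in totals.items()}

graded = [("film", True), ("film", False), ("lighting", True), ("lighting", True)]
print(category_report(graded))  # e.g. {'film': 50, 'lighting': 100}
```

Because the report aggregates by category, it gives candidates direction without leaking any individual test question.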
06-06-2011, 09:51 PM
Yeah, I understand the scope and the responsibilities of the commission's charter, but this just falls under the category of a "wish list" - things that we would like to have if there was some kind of way to do it without compromising the process. I remember that you mentioned that you had taken an EMT class - think about it this way - a Certified EMT responds to a heart attack you're having - and the only 22 questions that he missed were those on how to do CPR... but no one told him that he didn't know how to do it... but he got his Certification and got the job anyway.
Not picking - just throwing out things that I'd like to see on our "future wish list" - and some kind of feedback to those who PASSED is on my "wish list".
There are ways to do it - like having 10 different versions of the same question and having a computer randomly pick one question out of each set of 10 for that question for that particular test - or a generic feedback remark on the subject of each question so that the test report could say that you missed the following 12 questions - and list the generic subject matter of the question - so the Certificant could go back and review those areas. It would just take a lot of volunteer time to write all the questions, and a lot of programming time to write that random selection and feedback loop program - (and - I know the programmers/IT-Guys - and I know that they are kept very busy).
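The pick-one-of-ten idea above is a small randomization routine. Here is a minimal sketch, assuming each exam slot has its own pool of interchangeable variants (the pool contents below are made-up placeholders):

```python
# Hypothetical sketch of the "10 versions per question" idea: each exam
# slot has a pool of interchangeable variants, and one is drawn at random
# per generated test. Pool contents are placeholders.
import random

def build_exam(question_pools, seed=None):
    """question_pools: list of variant lists, one list per exam slot.
    Returns one randomly chosen variant per slot."""
    rng = random.Random(seed)  # seedable for reproducible test forms
    return [rng.choice(pool) for pool in question_pools]

pools = [
    ["Q1 variant A", "Q1 variant B", "Q1 variant C"],
    ["Q2 variant A", "Q2 variant B"],
]
exam = build_exam(pools, seed=42)
print(exam)  # one variant per slot
```

With enough variants per slot, revealing which slots a candidate missed discloses only the generic subject matter, not a reusable question.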
06-06-2011, 11:15 PM
I think it would be a very valuable resource and it should be on a wish list. Give me time to weasel my way on the board and I'll try to make it happen. I realize a lot of the work is accomplished by volunteers but let's face it, candidates pay $100 to declare their candidacy and if 200 are doing this every quarter then that's $80,000 a year. Somewhere in that total is room for a little creative programming.
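The revenue estimate in the post above checks out as back-of-envelope arithmetic (the candidate counts are the poster's assumption, not official figures):

```python
# Checking the back-of-envelope estimate from the post above:
# $100 candidacy fee, ~200 candidates per quarter, 4 quarters per year.
fee = 100
candidates_per_quarter = 200  # poster's assumption, not an official figure
annual_revenue = fee * candidates_per_quarter * 4
print(annual_revenue)  # 80000
```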
06-07-2011, 01:05 AM
Unless the test reporting has changed since I took the exam, I actually received a report which told me how I did in various subjects. I remember, as I expected, my weakest area was film type selection and usage.
06-07-2011, 01:16 AM
"Give me time to weasel my way on the board" - AHA! So that's your motive :D
Unfortunately, we can put a lot of things on a wishlist, but if it doesn't comply with ICE (Institute for Credentialing Excellence) standards, it can't happen. As Greg mentioned, Certification is not supposed to be an educating body; it's a test of competence.
06-07-2011, 02:36 AM
This is the first time I've heard of ICE, so I just did a search on their Standards, and found a 31 page pdf document...
National Commission for Certifying Agencies
Standards for the Accreditation of Certification Programs
Institute for Credentialing Excellence.
Revised December 2007 (editorial only)...
While the ICE Standard does state that …To avoid conflicts of interest between certification and education functions, the certification agency must not also be responsible for accreditation of educational or training programs or courses of study leading to the certification...
it also says in Standard 13 that...
...Candidates must be provided meaningful information on their performance on assessment instruments. Such information must enable failing candidates to benefit from the information and, if psychometrically defensible, understand their strengths and weaknesses as measured by the assessment instruments.
So it sounds like while PPA could not assume responsibility for accreditation of an educational or training course for the CPP exam, there should not be any ICE Standard violation issues with some kind of feedback being provided by the PPCC, since the ICE standard states that performance feedback should be provided. Maybe someone could look into that - or am I misreading it?
06-07-2011, 02:53 AM
Certification does provide feedback on the image portion of the evaluation ("Instrument") because it still comes down to the candidate's competence. The written exam should reflect the candidate's ability to produce the images. In theory, if you practice good photographic skills and do what the written exam tests you on, you are demonstrating what you know, and not what you've studied;)
06-07-2011, 02:59 AM
Sorry - I mistyped it - I should have written PPCC instead of PPA...
My last paragraph should read...
So it sounds like while PPCC could not assume responsibility for accreditation of an educational or training course for the CPP exam, there should not be any ICE Standard violation issues with some kind of feedback being provided by the PPCC, since the ICE standard states that performance feedback should be provided. Maybe someone could look into that - or am I misreading it?
Or maybe pass this on to someone on the PPCC board.
06-07-2011, 03:04 AM
Rick, I've re-edited my post for clarification;)
So, you are correct that PPCC cannot accredit courses, like the various prep courses that the Meeks do, as we would be in violation of ICE standards. By "instruments", they mean the practical portion of the exam. For example, when a mechanic gets ASE certified, the examiner can tell them what they did incorrectly because the candidate still needs to sharpen their skill to get it right the next time. In our world, if we tell the candidate that their images are underexposed, then it's up to the candidate to research how to improve their exposure. If the images are flat lit, the candidate needs to work on light ratio examples...etc.
06-07-2011, 03:17 AM
I'm a man on a mission Michael :)
06-07-2011, 03:32 AM
I smell opportunity here. I think it would be a great (almost necessary) benefit for us (PPA) to add an organized online resource for learning the fundamentals of photography. It just seems like that is something that belongs here. Most of the information is already here but it's scattered amongst articles, forum threads and webinars. Pulling it all together would be extremely beneficial to new members. With so many people getting college degrees online these days I think it's about time we jumped on that bandwagon and started offering some organized education as well. Just a thought.
06-07-2011, 11:36 AM
I think a wish list opportunity is to have some form of notification done through our membership page; we could eliminate the paper and mailing costs. As well, notification could be quicker and could potentially eliminate the onslaught of status questions and emails to the cert. committee.
06-07-2011, 12:02 PM
Rick is right that an opportunity is still being missed here. Even if the PPCC can't (at this time) provide specific individual results, it can provide general statistics on how people do in various areas of the test. It can also report on what general areas a test covers and why.
It seems rather silly to me that there is no knowledge transfer between test creation and education. The Air Force certainly doesn't work that way, and Air Force skill testing is extremely serious business. The Air Force put me through a 160-hour course before allowing me into the group that creates tests for my particular field, and there will be someone from the "schoolhouse" - someone in the coursework business - right there with the test creators. Why? Because it's stupid to test someone on knowledge that isn't available. That's called playing "Stump the Dummy." There has to be a consideration of whether the information is actually available and being taught.
A proper certification test must also be created based on real data about what people in the field are actually doing, and that data must be refreshed for every test generation cycle. People gripe about film questions on the test... well, how many people in the field are still using film? The test creators should base that decision on real data. Does the PPCC conduct any surveys of professional photographers to determine such things?
06-07-2011, 05:34 PM
Thanks for the information and discussion.
06-07-2011, 05:52 PM
Not a problem Kevin. A good debate often brings about good change. Welcome aboard and we look forward to hearing more from you.