Top 10 common pitfalls in the 2024 VCE science exams
Stacey Martin, Head of Science
What holds students back across Physics, Psychology, Biology, and Chemistry, and what it reveals about their thinking.
Skills are having their moment. They're appearing more often in exam questions and play a bigger role in the new Victorian Curriculum for science, rolled out from next year. In the 2024 exams they featured strongly and caused their fair share of problems (and likely exam-result heartache) for students, often in seemingly straightforward questions.
So here are the top 10 pitfalls we identified, how they manifest, and examples drawn from scrutinising errors in the 2024 VCAA exams for Biology, Chemistry, Psychology and Physics.
1. Graphs gone wrong
Students continue to struggle with both constructing and interpreting graphs.
Issues:
Mislabelled or missing axes
Wrong graph type (line vs bar)
Ignoring units
Poor lines of best fit (drawn through the first and last points only)
Examples:
Psychology Q7c: 80% scored zero when asked to create a two-bar graph showing a percentage difference in sleep patterns.
Physics Q16b: Students defaulted to drawing a line through just the first and last points, despite middle data contradicting this.
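The first-point/last-point shortcut above can be contrasted with a proper least-squares line of best fit, which takes every data point into account. The sketch below is illustrative only (the data is made up, not taken from the exam):

```python
# A minimal least-squares line of best fit, computed from ALL data points
# rather than just the first and last. Illustrative data, not exam data.
def best_fit(xs, ys):
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    # gradient = sum of (x-dev * y-dev) / sum of (x-dev squared)
    gradient = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) \
               / sum((x - mean_x) ** 2 for x in xs)
    intercept = mean_y - gradient * mean_x
    return gradient, intercept

xs = [1.0, 2.0, 3.0, 4.0, 5.0]
ys = [2.1, 3.9, 6.2, 7.8, 10.1]
m, c = best_fit(xs, ys)
print(f"y = {m:.2f}x + {c:.2f}")
```

A line forced through only the first and last points here would ignore how the middle values sit relative to the trend, which is exactly the habit the Physics examiners penalised.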
2. Vague or misused scientific terms
Precision in terminology matters, and students often lack it.
Issues:
Confusing accuracy vs precision vs resolution
Swapping systematic and random error
Misnaming variables (e.g. calling a DV the IV)
Examples:
Psychology Q4e: 56% couldn’t explain how the H-index corrected for systematic error.
Physics Q20: 15% couldn’t identify a definition of precision, a deceptively simple multiple-choice item.
Chemistry Q8b.iv: Many students incorrectly stated the resolution of a burette as 0.5 mL instead of the correct 1.0 mL, revealing a gap not just in terminology but in practical scientific understanding: students misapplied what a measuring instrument can legitimately report based on its markings.
3. Command term confusion
Students often treat explain, evaluate, and justify as synonyms for describe, which undermines their marks.
Examples:
In Psychology extended responses, students gave correct facts but failed to evaluate CBT, costing them up to 4 marks.
Biology reports noted students often skipped the comparison element in "compare and contrast" tasks, leading to partial marks.
4. Graphical interpretation, not just drawing
Even when students plotted correctly, many struggled to extract meaning.
Examples:
Physics Q16c.i: Many failed to correctly calculate Planck’s constant from the graph, either by choosing points that did not sit on the line of best fit or by skipping the gradient calculation entirely.
Biology: When analysing enzyme activity or respiration rate curves, students often just described “it increases” without linking to molecular reasoning.
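For the Planck's constant example, the expected approach is to read the gradient from the line of best fit and then convert it. A minimal sketch, assuming the usual stopping-voltage vs frequency graph from the photoelectric effect (the numbers below are illustrative, not the 2024 exam data):

```python
# Estimating Planck's constant from a stopping-voltage vs frequency graph:
# eV0 = hf - phi, so the gradient of V0 against f equals h/e.
# Points are illustrative readings from a line of best fit, not exam data.
E_CHARGE = 1.6e-19  # elementary charge, C

f1, v1 = 5.0e14, 0.25   # (frequency in Hz, stopping voltage in V)
f2, v2 = 9.0e14, 1.90   # a second, well-separated point on the line

gradient = (v2 - v1) / (f2 - f1)   # units: V s, equal to h/e
h = E_CHARGE * gradient            # Planck's constant, J s

print(f"gradient = {gradient:.3e} V s")
print(f"h = {h:.2e} J s")
```

The key step students skipped is the middle line: choosing two well-separated points on the line (not raw data points) and forming the gradient before multiplying by the charge.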
5. Guessing the equation instead of understanding the context
Formula hunting on cheat sheets was rampant, but students didn’t always understand why or when to use them.
Examples:
Physics Q6: Students could calculate a force, but many couldn’t compare it to determine if orbiting was possible, defeating the purpose of the question.
Chemistry titration questions: Many used pH = -log[H⁺] where it wasn’t appropriate, or used mole ratios without identifying the limiting reagent.
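The titration point can be made concrete: pH = -log₁₀[H⁺] is only usable once the limiting reagent has been identified and the excess [H⁺] worked out. A sketch with illustrative numbers (not drawn from the exam):

```python
# Why the limiting reagent comes before the pH formula.
# Illustrative mixture: 25.0 mL of 0.100 M HCl with 20.0 mL of 0.100 M NaOH.
from math import log10

n_acid = 0.0250 * 0.100   # mol HCl
n_base = 0.0200 * 0.100   # mol NaOH -- the limiting reagent here

n_excess_h = n_acid - n_base        # mol H+ remaining after neutralisation
total_vol = 0.0250 + 0.0200         # total volume, L
conc_h = n_excess_h / total_vol     # [H+] in the mixed solution, M

pH = -log10(conc_h)                 # only valid now that [H+] is known
print(f"[H+] = {conc_h:.3e} M, pH = {pH:.2f}")
```

Applying pH = -log[H⁺] to the original acid concentration, without the limiting-reagent step, is exactly the shortcut the question punishes.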
6. Poor variable identification in experimental design
Inability to correctly identify or explain independent, dependent, and controlled variables continues to be widespread.
Examples:
Psychology Q6a: 37% scored zero when asked to identify two IVs in a conditioning experiment.
Biology practical questions: Frequent confusion between the control group and a controlled variable.
7. Abstract knowledge, poor application
When questions shifted from recall to unfamiliar scenarios, many students faltered.
Examples:
Psychology Q1b: 38% scored zero because they explained memory theory without relating it to the TEETH mnemonic poster.
Chemistry: When given unfamiliar salts, students often misjudged pH due to lack of practice with unknown combinations.
8. Over-explaining or listing irrelevant content
Students often wrote everything they knew instead of answering the question.
Examples:
Biology ERQ on signalling pathways: Many responses dumped definitions of every type of receptor but never answered the specific prompt.
Physics Q13e: Students often mixed up time dilation and length contraction, trying to use both in the same argument, and ended up contradicting themselves.
9. Mathematical setups without reasoning
Students could calculate but often didn’t show why their calculation was relevant.
Examples:
Physics Q1a: Some students gave only a final answer (6.0 × 10⁻² m/s²) without any working — resulting in zero marks.
Chemistry: In calorimetry or energy questions, students used ΔQ = mcΔT but didn’t specify which mass or temperature applied.
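To make the calorimetry point concrete: in Q = mcΔT, the mass and the temperature change both belong to the water being heated, not to the fuel burned. A sketch with illustrative figures:

```python
# The Q = mcDeltaT bookkeeping that section 9 describes. Note that m is the
# mass of WATER in the calorimeter and DeltaT is the WATER's temperature
# change -- not the fuel's. Figures are illustrative, not exam data.
c_water = 4.18                     # specific heat of water, J g^-1 C^-1
m_water = 200.0                    # g of water heated (not mass of fuel)
t_initial, t_final = 21.5, 38.0    # water temperature, degrees C

q = m_water * c_water * (t_final - t_initial)  # energy absorbed by water, J
print(f"Q = {q:.0f} J ({q / 1000:.2f} kJ)")
```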
10. Poor experimental justification or evaluation
Science inquiry skills, like identifying bias, sources of error, or validity, were often the weakest parts of responses.
Examples:
Psychology Q4d: 61% scored zero on a question about random sampling and representativeness.
Biology: In practical design questions, many students failed to explain why a control was needed, or what data would make conclusions valid.
Chemistry: Students rarely justified their conclusions based on actual data patterns in extended-response tasks.
How Edrolo supports students’ skills development from Years 7-12
All our VCE science courses - Biology, Chemistry, Physics and Psychology - include skills videos that unpack the key science skills and their application in each subject. For Daily Practice and Daily Plus subscribers, there are also additional key science skills questions to accompany these videos and help students put the skills into practice.
The new Edrolo Years 7-10 Science for the refreshed Victorian Curriculum is the only teaching and learning system that explicitly teaches skills progressively across the junior years. Our key science skills units and questions scaffold students’ familiarity and confidence in skills, and expose them to the types of questions and problems they’ll encounter in senior sciences.
For example, the scenario below from Year 10:
targets key terms: Questions explicitly teach the correct definitions and use distractors to highlight subtle distinctions.
scaffolds contextual vocabulary: Embeds key terms through exam-style practice across different topics.
builds lasting fluency: Low-stakes examples from Years 7-10 to ensure confident, accurate usage in VCE answers.
An example question from Year 10 Edrolo Science
Making key science skills visible
These pitfalls aren’t just slip-ups; they reveal why science teaching must continue to go beyond content. Students need structured support in:
Visual literacy (especially graphing and interpreting data)
Scientific language (used precisely, and in context)
Reasoning and evaluation (not just calculations and definitions)
Let’s make skills as visible as content in every science classroom.