
Standards from a curriculum and assessment perspective

Barry McGaw


In current discussion about performance standards for students in Australian schools, two distinct themes are emerging. One is about the adequacy of actual levels of performance and reflects considerations of accountability. The other is about what standards students should be achieving in schools and reflects, in part at least, economic pressures for skill improvement in the workplace. The first has primarily an assessment focus, the second primarily a curriculum focus. In the second, there is a growing willingness to think nationally, in the first a much stronger continuing commitment to state and territory responsibilities and rights.

Assessing adequacy of student achievement

Public debate about the adequacy of current performance levels is often driven by claims that they are in decline. The evidence advanced is typically anecdotal and not systematic. The first national effort to document performance levels, undertaken in 1975 for a House of Representatives Select Committee on Specific Learning Difficulties, was designed to estimate the proportions of each age group performing below defined levels of minimum competence in literacy and numeracy (Keeves & Bourke, 1976). The committee concluded that significant numbers of children were failing to reach adequate levels but acknowledged that data on a single occasion could not reveal anything about changes in levels of performance (House of Representatives Select Committee, 1976). A repeat survey in 1980, commissioned by the Australian Education Council, demonstrated no decline in performance levels since the 1975 survey.

Subsequently, plans for further national surveys were abandoned in the face of substantial opposition from both teachers' organisations and education bureaucracies, on the grounds that such surveys were inevitably narrow in their scope and thus likely to distort the curriculum. Having opposed the continuation of national assessment, the Directors-General of Education then established the Australian Cooperative Assessment Program - as a collaborative program through which the States and Territories might develop and share strategies and materials, and as a shield against further national initiatives. While the program actually achieved little in the development of shared approaches to assessment (Hill, 1994), it was the agency through which significant curriculum collaboration was subsequently commenced.

The absence of national assessment does not reflect a current rejection of assessment and monitoring programs. It reflects the constitutional allocation of responsibility for education to the States and Territories. The nature and extent of system-level monitoring is in flux but Figure 1 represents the current actual or announced position.

Figure 1: School years in which monitoring programs are conducted
(Adapted from Lokan and Ford, 1994)

In the past all school systems conducted syllabus-based, external examinations at the end of primary school and in the middle and at the end of secondary school. Only those at the end of secondary school remain but these provide little evidence about performance levels since results are distributed normatively, in essentially predetermined ways over the group of students presenting.

Some systems, particularly Queensland and Victoria, have attempted to develop and use grade-related performance criteria for the allocation of grades within individual subjects. However, the dominance of the normatively used tertiary entrance score, constructed as a scaled aggregate of each student's performances, means that more attention is given to this aggregate than to individual subject results. Furthermore, as participation rates to the end of secondary school have risen from around 30 per cent of the age group in 1980 to almost 80 per cent in the mid-1990s, the cohort of students in Year 12 has changed dramatically. The normative allocation of scores or grades in the same fashion within the cohort of candidates from year to year provides no means of identifying or representing changes in performance levels over time.

Tasmania and Queensland have maintained periodic surveys of performance of students at lower levels with tests of random samples of students. Victoria introduced such surveys in 1988 and Western Australia has also introduced sample testing. In order to report on individual students' performances to parents and schools, full cohort testing was introduced in the Northern Territory at primary level in 1983 and at mid-secondary level in 1989. New South Wales introduced cohort testing in 1989 and South Australia will adopt the New South Wales tests from 1995. Victoria plans to introduce cohort testing at primary level in 1995, and Queensland has announced that it will add cohort testing at two year levels to its existing sample surveys. Apart from South Australia's forthcoming adoption of New South Wales' tests, there is no comparability of assessments across the systems.

These system-level monitoring programs provide evidence about student performance levels over time. Tasmania established a survey cycle taking the 1975 national survey as its benchmark. This series of surveys reveals, for Tasmania, no change in basic reading skills since 1975 and, until the appearance of a decline in the early 1990s, no change in basic numeracy skills (e.g. Evaluation and Assessment Unit, 1993). Surveys in Victoria, providing comparisons with data from 1975 and 1980, concluded that there had been no decline in standards of performance (e.g. McGaw et al., 1989).

The Western Australian monitoring program provides comparisons of achievement levels in writing and mathematics in 1990 and 1992. It shows significant improvement in writing at both Years 7 and 10, and in mathematics at Year 7, but no change in mathematics at Years 3 and 10. Some of these improvements were attributed to emphases in the curriculum. For example, improvement in the conventions of writing - punctuation, spelling and form - was attributed to the emphasis on process writing in the English curriculum (Titmanis et al., 1993, pp. 16-17).

In 1990 Queensland tested aspects of mathematics, reading and writing at Years 5, 7 and 9. Reading and writing were assessed again in 1992, revealing 'a slight but consistent upward shift at all three year levels in reading and writing, with the most marked improvements being in writing at Years 5 and 7' (Review and Evaluation Directorate, 1993). Performance levels in mathematics were assessed again in 1993 and showed a slight improvement at the lower range of the scale at Year 5, no change at Year 7, and a slight improvement over the full scale at Year 9 (Quality Assurance Directorate, 1994).

These State monitoring programs typically include some items common to the tests at different year levels, permitting the tests to be calibrated onto a single scale and thus performance to be compared directly across year levels. These comparisons reveal substantial overlap of the distributions. The Queensland reading survey, for example, provides the diagram reproduced as Figure 2, which shows the top 25 per cent of Year 5 students to be above the mean of Year 7 students, and the bottom 25 per cent of Year 9 students to be below the mean of Year 7 students. Similar substantial overlap of the distributions of Year 5 and Year 9 students is evident in Victoria in literacy and numeracy (McGaw et al., 1989) and in science (Adams et al., 1991).

Figure 2: Changes in distributions of reading performance in Queensland
(from Review and Evaluation Directorate, 1993, p. 10)
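
The mechanics of placing two year-level tests on the one scale can be sketched briefly. The fragment below shows simple mean-shift (common-item) linking of two separately calibrated item sets; the item labels and difficulty values are invented for illustration, and this standard linking approach is an assumption rather than the specific procedure used in any of the surveys discussed here.

```python
# A minimal sketch of common-item linking, assuming each year-level test
# has been separately calibrated to give item difficulties in logits.
# Item labels and values are invented for illustration.

year5 = {"common1": 0.3, "common2": 0.9, "y5a": -1.2, "y5b": -0.4}
year7 = {"common1": -0.8, "common2": -0.1, "y7a": 0.6, "y7b": 1.4}

# Items present in both tests anchor the link between the two scales.
common = set(year5) & set(year7)

# Mean-mean linking: shift the Year 7 estimates so that the common items
# have the same average difficulty on both forms.
shift = sum(year5[i] - year7[i] for i in common) / len(common)
year7_linked = {item: diff + shift for item, diff in year7.items()}

print(f"linking shift: {shift:+.2f} logits")
for item in sorted(year7_linked):
    print(f"{item}: {year7_linked[item]:+.2f} (on the Year 5 scale)")
```

Once both tests are expressed on a single scale in this way, overlaps of the kind shown in Figure 2 can be read directly from the linked distributions.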

Within individual systems, the new monitoring programs are beginning to yield comparisons of achievement levels for subgroups of students and also comparisons over time. For the New South Wales program, a full public report was published for the first year of testing in 1989. This showed females performing better than males in literacy and numeracy at Year 3, and in literacy and number at Year 6, but males performing better in measurement and space at Year 6 (Masters et al., 1990). Students of non-English speaking background and students from the indigenous Aboriginal and Torres Strait Islander populations had lower performance levels than the overall student population. Similar analyses, including analyses of emerging trends over time, have been undertaken in subsequent years but have not been published for wide distribution. There is some evidence of improvement in the performance levels of students of non-English speaking background, but in these early years of the program some adjustments were also made to the language of the tests to guard against any 'disadvantage' such students might suffer, so the effects are somewhat confounded.

The 1975 and 1980 national surveys of performance levels of 10 and 14-year-olds in literacy and numeracy revealed no decline over that five-year period (Bourke et al., 1981). Other sources of national data on achievement levels are limited to three surveys conducted under the auspices of the International Association for the Evaluation of Educational Achievement (IEA). These are the First International Mathematics Study conducted in 1964, and the First and Second International Science Studies conducted in 1970 and 1983. Keeves' (1992) comparison of achievement levels in the ten countries that participated in both science studies showed that Australian performance levels had remained constant, while those of a number of other countries had risen. In 1970 Australia ranked a clear third among 14-year-olds, but in 1983 was tied in fourth place with six other countries. In negotiations between the school systems for the two national literacy and numeracy studies, the possibility of comparisons between State and Territory systems was explicitly excluded. This exclusion may seem surprising, given that the possibility of experimentation through variations between the systems has been offered as one of the strengths of a federal system - one which has eight independent jurisdictions in a nation of only 18 million people.

The Second International Science Study, however, did provide comparisons among the Australian State and Territory systems, and Rosier and Banks (1990, p.128) show that there were marked differences. The results for the study of 14-year-olds, together with the comparative results for some other nations (Postlethwaite & Wiley, 1991, p.60), are summarised in Figure 3. There it can be seen that the Australian Capital Territory was just behind Japan, the second ranked nation; Queensland and Western Australia were close to the Netherlands, the third ranked nation; New South Wales, South Australia and Tasmania were essentially equivalent to Australia as a whole, and thus also to the set of nations tying at fourth rank; and Victoria and the Northern Territory were lower down at the level of Singapore and England, but above that of the USA.

Figure 3: Achievements of 14-year-olds by education system, 1983/84
(Source: Rosier & Banks, 1990, p. 128 and Postlethwaite & Wiley, 1991, p. 60)

Setting standards through the curriculum

In the late 1980s, pressure for more curriculum consistency across the State and Territory systems was mounted by the Federal Minister for Education (Dawkins, 1988) and gained support from the business community. The Directors-General of Education commissioned a succession of curriculum mapping exercises. Initially, the objective was little more than to seek to demonstrate that considerable consistency already existed and that initiatives for national consistency were unnecessary. The work, however, then became a collaborative effort to develop curriculum and assessment frameworks in English and mathematics. One key to the level of success achieved was the curriculum focus and the attempt to address directly issues of classroom instruction and assessment for the purpose of reporting to parents. Support for this initiative, particularly from teachers, was obtained because neither national nor system monitoring was among its purposes.

In April 1989, at a meeting in Hobart, the State, Territory and Federal Ministers of Education declared themselves 'conscious that the schooling of Australia's children is the foundation on which to build our future as a nation' and 'willing to act jointly to assist Australian schools in meeting the challenges of our time'. They defined ten agreed national goals of schooling, set out in what is sometimes referred to as the Hobart Declaration on Schooling (Department of Employment, Education and Training, 1990). In 1990, the Ministers, apart from New South Wales, established the Curriculum Corporation, as a jointly owned company through which collaborative work might be facilitated. New South Wales joined in 1993.

From the ten national goals, the Council of Ministers identified eight broad learning areas as the overall structure of the curriculum. These were the Arts, English, health and physical education, languages other than English, mathematics, science, studies of society and environment, and technology.

There is a certain convenience about dividing the total curriculum into a relatively small number of areas, particularly if these then become areas in which all students must study. But there is inevitably debate about the results. Traditional studies such as English, mathematics and science can reasonably be dealt with as distinct learning areas, but others are less coherent. According to Collins (1994, p.10), the document on the Arts 'is confined to making rather bland generalisations because what it says has to be valid across a range which covers music, crafts, all traditional and new visual arts and the performing arts'; society and the environment deals with 'socio-cultural questions [which] are fundamentally of a different epistemological order than questions about the physical/ecological environment'; and health and physical education wins 'the prize for the least compatible components'.

For each learning area, the Council of Ministers commissioned the development of a statement and profile. Statements provide a framework for what will be taught. Profiles set out what students are expected to learn.

A statement defines a learning area in terms of strands that specify content and process. It also provides a curriculum framework by suggesting a sequence for developing knowledge and skills within each strand across four bands, which are broad stages across the school years.

Table 1 gives the strands for the studies of society and environment learning area and identifies the scope of the bands used for all learning areas. Of the six strands into which this learning area has been organised, the first deals with key processes used in all studies in this area and the other five identify key concepts to be learned.

The statement provides some elaboration of each strand but does not provide a syllabus. It provides a structure for courses that schools or other agencies might develop.

Table 1: Structure of statement for Studies of Society and Environment
(from Curriculum Corporation, 1994c)

Learning area: Studies of Society and Environment

Strands
1. Investigation, communication and participation
2. Time, continuity and change
3. Place and space
4. Culture
5. Resources
6. Natural and social systems

Bands (broad, overlapping stages for all learning areas)
A. Roughly lower primary (Years 1-4)
B. Upper primary (Years 4-7)
C. Junior secondary (Years 7-10)
D. Post-compulsory (Years 11-12)

A profile is a description of the progression in learning outcomes typically achieved by students during the years of schooling in a particular learning area. While statements are sequenced into four bands to correspond to successive but overlapping stages of schooling, profiles are sequenced into eight levels, which correspond roughly to the first ten years of schooling.

The profiles provide details for subdivisions of the strands, referred to as strand organisers. Within each strand organiser, student learning outcomes are defined for each of the eight levels. For English, for example, there are three strands: speaking and listening, reading and viewing, and writing. Each of these is subdivided into the same four strand organisers: texts, contextual understanding, linguistic structures and features, and strategies (Curriculum Corporation, 1994b). The outcome statements for two of the four strand organisers in the Speaking and Listening Strand are shown in Table 2.
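
The hierarchy just described - strand, strand organiser, level, outcome - can be made concrete with a small sketch. The nested structure below is purely illustrative, not an official data format; the two Level 1 outcomes are quoted from Table 2, and the omitted parts are marked.

```python
# An illustrative representation of the profile hierarchy described
# above: strand -> strand organiser -> level -> outcome statement.
# Only two Level 1 outcomes (quoted from Table 2) are filled in.
english_profile = {
    "Speaking and Listening": {
        "Linguistic Structures and Features": {
            1: "Draws on implicit knowledge of the linguistic structures "
               "and features of own variety of English when expressing "
               "ideas and information and interpreting spoken texts.",
            # Levels 2-8 omitted here.
        },
        "Strategies": {
            1: "Monitors communication of self and others.",
            # Levels 2-8 omitted here.
        },
        # "Texts" and "Contextual Understanding" organisers omitted here.
    },
    # "Reading and Viewing" and "Writing" strands omitted here.
}

def outcome(strand: str, organiser: str, level: int) -> str:
    """Return the outcome statement for one strand organiser at one level."""
    return english_profile[strand][organiser][level]

print(outcome("Speaking and Listening", "Strategies", 1))
```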

For each level there is a statement which gives a general description of student performance at that level. For each outcome of the type shown in Table 2, there is also a set of pointers, which are indicators of the achievement of the outcome. Unlike the outcomes, the pointers are only examples. Table 3 illustrates these elements for Level 5 of the Speaking and Listening strand of the English profile, setting side by side the Linguistic Structures and Features and the Strategies strand organisers. This material is further accompanied by annotated samples of student work which demonstrate achievement of one or more outcomes at the particular level.

National statements and profiles are now available for all eight learning areas but their status is somewhat ambiguous. Having adopted the Hobart Declaration in 1989, endorsed the development of national profiles in English and mathematics in 1990, and resolved in 1991 that the structure of statements and profiles should also be the basis for national work in the remaining six learning areas, the Ministerial Council in 1993 substantially backed away from this commitment to national collaboration. One possible explanation for this retreat lies in a change in membership of the Council, and more significantly, in a change in the political balance of members following changes in some State governments at elections. The national initiative was seen in some circles as a Federal government initiative which carried risks for the autonomy of the State and Territory systems. So, with the Federal Minister in a political minority for the first time in a considerable period, there was a loss of consensus about cooperation. A more subtle explanation acknowledges that the statements and profiles were developed as a genuinely collaborative enterprise in which States and Territories had carriage of the work, but attributes their withdrawal from full cooperation to proposals from the Federal authorities to use the profiles as a basis for national monitoring and reporting. A third explanation attributes the change to public criticisms of some of the statements and profiles. For example, a group of university mathematicians were publicly critical of the mathematics profile and generated some loss of confidence in that product. A fourth explanation seeks to minimise the significance of the withdrawal by representing referral to State and Territory Ministers for individual consideration and action as a constitutional necessity, since responsibility for education rests with the States.

Table 2: Outcome statements for two strand organisers in one strand in English profile
(from Curriculum Corporation, 1994a)

Strand: Speaking and Listening

Level 1
  Linguistic Structures and Features: Draws on implicit knowledge of the linguistic structures and features of own variety of English when expressing ideas and information and interpreting spoken texts.
  Strategies: Monitors communication of self and others.

Level 2
  Linguistic Structures and Features: Experiments with different linguistic structures and features for expressing and interpreting ideas and information.
  Strategies: Speaks and listens in ways that assist communication with others.

Level 3
  Linguistic Structures and Features: Usually uses linguistic structures and features of spoken language appropriately for expressing and interpreting ideas and information.
  Strategies: Reflects on own approach to communication and the ways in which others interact.

Level 4
  Linguistic Structures and Features: Controls most linguistic structures and features of spoken language for interpreting meaning and developing and presenting ideas and information in familiar situations.
  Strategies: Assists and monitors the communication patterns of self and others.

Level 5
  Linguistic Structures and Features: Discusses and experiments with some linguistic structures and features that enable speakers to influence audiences.
  Strategies: Listens strategically and systematically records spoken information.

Level 6
  Linguistic Structures and Features: Experiments with knowledge of linguistic structures and features, and draws on this knowledge to explain how speakers influence audiences.
  Strategies: Critically evaluates others' spoken texts and uses this knowledge to reflect on and improve own.

Level 7
  Linguistic Structures and Features: Uses awareness of differences between spoken and written language to construct own spoken texts in structured, formal situations.
  Strategies: Uses a range of strategies to present spoken texts in formal situations.

Level 8
  Linguistic Structures and Features: Analyses how linguistic structures and features affect interpretations of spoken texts, especially in the construction of tone, style and point of view.
  Strategies: Uses listening strategies which enable detailed critical evaluation of texts with complex levels of meaning.

Whatever the reasons for the withdrawal from formal cooperative pursuit of a national curriculum structure, all of the systems are now actually introducing the national statements and profiles or some variant of them.

Table 3: Level 5 statement and indicators of outcomes for English
(from Curriculum Corporation, 1994a)

Level 5 Statement: Speaking and Listening
Students who have achieved Level 5 have command of a range of standard text types and features, and experiment with writing longer texts that discuss challenging aspects of subjects, and present justified views on them. They understand important elements of how texts are constructed and experiment with these elements in their own writing.

Students work well in formal groups where they take on roles, responsibilities and tasks, and they show progress in planning and delivering formal spoken presentations to their peers. They systematically listen to and record spoken information.

Students can give detailed accounts of texts in speech and writing, justifying them by referring to the text. They compare texts to examine their structures and ideas more closely, and show a sound understanding of the conventions of narrative texts.

Students use a variety of text types to write at length and with some sense of complexity. In writing longer pieces, they ensure clarity by checking layout, cause-and-effect sequences and grammar. They show a sense of the requirements of readers and experiment with manipulating prose for effect.

Strand organiser: Linguistic Structures and Features

Outcome: Discusses and experiments with some linguistic structures and features that enable speakers to influence audiences.

Evident when students, for example:
  • Observe and discuss the way that voice and body language affect audiences and can be used to enhance meaning and influence interpretation (the way gestures, posture, facial expression, tone of voice and pace of speaking may engage the audience's interest).
  • Note aspects of language use, such as vocabulary, rhythm and similes, which enhance particular spoken texts.
  • Discuss and experiment with the effect of intonation on meaning (say the same word, phrase or sentence in different ways to convey regret, anger, annoyance, humour).
  • ...

Strand organiser: Strategies

Outcome: Listens strategically and systematically records spoken information.

Evident when students, for example:
  • Prepare for listening (take pen and notebook or laptop computer to the viewing of an important video or a talk by a guest speaker).
  • Note cues, such as change of pace and particular words, which indicate a new or important point is about to be made.
  • Develop and use a personal abbreviation system to record information quickly.
  • ...

The smallest systems - South Australia, Tasmania, the Northern Territory and the Australian Capital Territory - for which the benefits of collaboration are greatest, are using the national materials. Queensland has adopted the mathematics statements and profiles, and with slight editing, the ones for English, but under the title 'Student Performance Standards'. Western Australia is using the title 'Outcome Statements' and has adopted the national materials in some learning areas such as mathematics; but has substantially reworked them in others, such as health and physical education. In New South Wales, the Board of Studies is progressively building the outcome statements from the national profiles into its syllabus documents - adding some additional ones (for example, handwriting in English), deleting a few, but preserving the national wording in all that it keeps.

Victoria has dropped Level 8 and collapsed the statements and profiles into a single publication, Curriculum and Standards Framework, released as a draft for discussion (Board of Studies, 1994). It retains the eight key learning areas and a strand structure. In some learning areas the material is essentially the same as the national material while in others it is different. In English, for example, the strands are the same. In the case of the components of Speaking and Listening, extracted from the national documents in Table 3, the Victorian materials differ only in deleting Level 8 and in defining Level 7 for Strategies as 'Draws on a range of strategies ...' rather than 'Uses a range of strategies ...' (Board of Studies, 1994). In others, such as mathematics and science, the Victorian materials are quite different from the national materials. The elaboration of the objectives is simpler and less extensive in the Victorian materials and there are no samples of student work to illustrate achievement of various levels.

In the first stages of the national development it was intended that the profiles would have only six levels and cover the period of compulsory schooling, Years 1 to 10. The Ministerial Council later requested the addition of Levels 7 and 8 to cover Years 11 and 12. However, the diversification of the curriculum in Years 11 and 12 and the specialised nature of some of the courses made this task extremely difficult. The scope of the profiles was then limited again to Years 1 to 10, but Levels 7 and 8 were retained to capture some of the outcomes that might be achieved by advanced students in Year 10. Profiles are said to describe 'the progression of learning typically achieved by students during the compulsory years of schooling (Years 1-10)' with the twofold purpose 'to help teaching and learning and to provide a framework for reporting student achievement' (Curriculum Corporation, 1994c, p.1).

A merging of curriculum and assessment perspectives

In the 1990s these new curriculum and assessment initiatives, and the earlier ones concerned primarily with assessment, have substantially come together. Demands from business and industry groups for clear curriculum frameworks, improvement in student performance levels and the development of monitoring systems, have been evident in various pronouncements.

The National Industry Education Forum (1991) nominated as one goal for schools the development 'in all major curriculum areas of national curriculum statements and frameworks which will identify common learning tasks and agreed performance standards'; and as another goal the development of 'a comprehensive system of performance and accountability measures which will allow for valid and reliable assessment of student and teacher performance as a basis for national and international comparison'.

In pursuit of these goals, the Forum commissioned a paper on assessment and monitoring systems which analysed the inadequacies of available data and described strategies for implementing national monitoring procedures (Masters, 1991). The Forum then elaborated strategies for achieving the goals (National Industry Education Forum, 1992). The Institute of Public Affairs, a think-tank with substantial business support, similarly promotes the introduction of national assessment 'preferably in Years 3, 6 and 9, to ensure that acceptable standards in English and mathematics are being attained, and to identify strengths and weaknesses at the individual, school and systemic level' (Kramer et al., 1992). It is also widely anticipated that the Industry Commission, in a forthcoming report commissioned by the Council of Australian Governments, will recommend national assessment of some aspects of student performance.

Performance standards have been developed both a priori and a posteriori. The development of the national profiles represents an a priori approach in which the statements of standards to be achieved were formulated to express desired learning outcomes. Some monitoring programs, on the other hand, have developed definitions of performance standards a posteriori, following examination of the measured outcomes that students actually achieve.

One example of a posteriori standard setting is provided by an evaluation of literacy and numeracy levels in Victorian schools. This study was commissioned by the Minister of Education to identify how many students were completing schooling with inadequate skills, as well as to compare current with past levels of achievement. The survey was based on samples of Year 5 and Year 9 students and used, at each level, tests keyed to the curriculum but with sufficient common items for all items at both levels to be calibrated onto a common scale (McGaw et al., 1989). The units on the scale, ranging from around 20 to around 60, were chosen arbitrarily to avoid numbers of the type frequently used in educational testing in schools, which are related to the number or percentage of items correct. Inspection of the specific content of items on the scale identified 35 as the level of minimally acceptable performance for adults. The percentages of Year 5 and Year 9 students below this level were then estimated and reported, thus answering the Minister's first question: how many students were completing schooling with inadequate skills. By exposing the definition in this way, others are enabled, by examining the items on the scale, to nominate more or less generous definitions of minimum competence, but it also requires that their definitions be equally explicit.
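
The arithmetic of this a posteriori approach is simple enough to sketch. In the fragment below, the cut-score of 35 follows the scale just described, but the student scale scores and the alternative cut-scores are invented for illustration.

```python
# A minimal sketch of reporting against an a posteriori cut-score.
# The scale and the cut-score of 35 follow the text above; the student
# scale scores are invented for illustration.
scores = [28.5, 31.4, 33.0, 36.2, 38.8, 41.7, 44.1, 45.6, 47.9, 52.3]

def percent_below(cut: float, scale_scores: list[float]) -> float:
    """Percentage of students whose scale score falls below the cut."""
    return 100 * sum(s < cut for s in scale_scores) / len(scale_scores)

print(f"below 35 (minimum competence): {percent_below(35, scores):.0f}%")

# Anyone preferring a more or less generous definition must make it
# equally explicit, after which the same calculation applies.
print(f"below 33 (more generous cut):  {percent_below(33, scores):.0f}%")
print(f"below 38 (less generous cut):  {percent_below(38, scores):.0f}%")
```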

In the New South Wales Basic Skills Testing Program a similar approach to scale calibration was taken, but a new form of presentation of results was developed to permit a verbal description of the performance levels of all students. From a detailed examination of the content of items, descriptions of performance bands on the scale were developed (Masters et al., 1990). Each student then received a personal report in which his or her performance was visually located on each of the five scales used. For each scale, the student was thus located within a performance band for which the description was printed at the foot of the report. Descriptions for all bands on all scales were printed on the back to indicate the range of performances represented on the scale. The location of the bands was influenced to some extent by normative considerations, since the bands were located in regions in which lay the performances of reasonable numbers of students at the year level. It is in precisely this sense that the setting of standards was a posteriori. This fitted the purpose of establishing a benchmark against which performances in subsequent years could be compared.
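
The reporting logic can be sketched as follows. The band boundaries and descriptions below are invented, not those of the Basic Skills Testing Program; the point is only that a student's scale score selects a described band for the printed report.

```python
# A minimal sketch of band-style reporting: a scale score is located in
# a described performance band. Boundaries and wording are invented for
# illustration.
from bisect import bisect_right

BANDS = [  # (lower bound of band on the scale, description)
    (0,  "Band 1: handles short, familiar texts with support."),
    (30, "Band 2: locates directly stated information."),
    (40, "Band 3: links information across a passage."),
    (50, "Band 4: interprets and evaluates unfamiliar texts."),
]
LOWER_BOUNDS = [lower for lower, _ in BANDS]

def band_description(scale_score: float) -> str:
    """Return the description of the band containing the score."""
    return BANDS[bisect_right(LOWER_BOUNDS, scale_score) - 1][1]

print(band_description(44.2))  # falls in Band 3
```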

When the Western Australian Monitoring Standards in Education project commenced, the first national profiles and the Western Australian outcome statements had already been developed, so the monitoring program was keyed to the outcome statements (Titmanis et al., 1993). This approach permitted levels of achievement to be defined a priori in terms of the outcome statements and allowed the test items to be designed to tap particular levels. It also permitted a first attempt at empirical validation of the classification of outcomes to levels. Figure 4 shows the calibrations of some of the test items designed to assess outcomes at Levels 1 to 4 on the space strand of mathematics.

Figure 4: Some calibrated items from space strand in WA Monitoring Standards Project
(from Titmanis et al., 1993, p. 7)

The level of outcomes tapped by each item shown in Figure 4 is indicated by the first digit in the item number. From the calibrated location it is clear that item 1.6, designed as an item to tap a Level 1 outcome, is more difficult than most of those shown as tapping Level 2 outcomes. Apparent anomalies such as this may reveal a fault in the item or a misallocation of the outcome to the particular level. As more items are developed and used, the testing process 'will provide an enhanced definition of each level . . . will most likely result in some adjustment to the boundaries' (Titmanis et al., 1993, p.8) and will probably also suggest adjustment to the locations of some specific outcomes.
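
A check of this kind is easy to express in code. In the sketch below, item numbers follow the Figure 4 convention (the first digit gives the intended level), but all calibrated difficulties are invented; an item is flagged when it calibrates above every item written for the level above its own, one plausible form of the anomaly that item 1.6 displays.

```python
# A minimal sketch of an empirical check on the allocation of outcomes
# to levels. Item numbering follows the Figure 4 convention (first digit
# = intended level); the calibrated difficulties are invented.
items = {
    "1.2": -2.1, "1.4": -1.8, "1.6": -0.2,   # designed to tap Level 1
    "2.1": -1.1, "2.3": -0.9, "2.5": -0.7,   # designed to tap Level 2
    "3.2": 0.4, "3.4": 0.8,                  # designed to tap Level 3
}

def intended_level(item_number: str) -> int:
    return int(item_number.split(".")[0])

# Group calibrated difficulties by intended level.
by_level: dict[int, list[float]] = {}
for number, difficulty in items.items():
    by_level.setdefault(intended_level(number), []).append(difficulty)

# Flag items that calibrate above every item aimed at the next level up:
# possible faulty items or misallocated outcomes.
for number, difficulty in sorted(items.items()):
    level = intended_level(number)
    next_level = by_level.get(level + 1)
    if next_level and difficulty > max(next_level):
        print(f"item {number} (Level {level}) is harder than all "
              f"Level {level + 1} items: check item or outcome allocation")
```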

Work on the national profiles, and subsequent work on State and Territory variants to cover the compulsory years of schooling in Years 1 to 10, was informed primarily by prior curriculum development experience and, where available, by research evidence on the developmental sequence of skill acquisition. These curriculum documents express goals and aspirations for schooling, moderated by the experience of some of the participants in actually teaching students at various levels of schooling in the relevant learning area.

Empirical considerations of what students can achieve have been of secondary importance in the development of the curriculum profiles. They have had a stronger role, though, in some of the monitoring programs designed to establish benchmarks against which to judge student achievement and monitor changes in achievement levels.

Empirical validation of sequenced outcome statements can lead to adjustments to standards other than the technical ones indicated by misfitting items. Normative considerations can also justify adjustments to standards. If only a few students are able to achieve outcomes set as desirable for their stage of schooling, then the appropriate response may be to set these outcomes at a higher level. But where the outcomes are an expression of goals earnestly held, an alternative approach would be to retain them, and to develop strategies that make them achievable at this earlier stage. Similarly, if almost all students at a particular stage can achieve outcomes actually set for a higher level than they have generally reached, there may be grounds for moving those outcomes to a lower level.

Careful development of assessment procedures can also clarify and refine a set of outcome statements in another significant way. The need to develop assessment tasks along a continuum of achievement can require more precise definition of the continuum than is provided in statements such as those in the profiles, particularly in areas not so richly addressed in typical curriculum statements. 'Speaking and listening' - as one of the strands in the English profile - is a case in point. Using tasks such as taking notes from a set of recorded instructions from 'mother' and from an announcer at an assembly, Forster et al. (1994) have developed a protocol for scoring responses which - while it is based on the national profile for English (Curriculum Corporation, 1994a) - produces a much more detailed continuum of skills. In doing so, it elaborates the original profile in terms of item locations, as shown in Figure 5.

Figure 5: Location of items on Listening Scale in DART English
(from Forster et al., 1994, p. 65)
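
What such a scoring protocol looks like in outline can be sketched as follows. The score points, descriptions and continuum locations below are invented and do not reproduce the actual DART scoring guide; they show only how scored responses can be tied to locations on a continuum of the kind plotted in Figure 5.

```python
# A minimal sketch of a protocol for scoring a note-taking task against
# a continuum of listening skill. All score points, descriptions and
# continuum locations are invented for illustration.
SCORING_GUIDE = {
    # score: (location on continuum in logits, description of response)
    0: (None, "no usable notes recorded"),
    1: (-0.9, "records isolated words from the spoken instructions"),
    2: (0.2,  "records most key points, with some detail missing"),
    3: (1.1,  "records key points and supporting detail systematically"),
}

def report(score: int) -> str:
    location, description = SCORING_GUIDE[score]
    where = "not locatable" if location is None else f"{location:+.1f} logits"
    return f"score {score} ({where}): {description}"

for score in sorted(SCORING_GUIDE):
    print(report(score))
```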

Overall, recent Australian experience suggests that the best standard setting occurs when curriculum and assessment considerations are married. Curriculum considerations alone produce a priori prescriptions of outcomes to be achieved which may be unrealistic in their level or their sequence. Assessment considerations alone necessitate an a posteriori approach to standard setting which, in sifting the data of student performance, can conceptualise outcomes in ways that are not adequately linked to the curriculum. A curriculum framework specified in terms of desired student outcomes provides a structure to which assessments can be keyed so that the results provide a curriculum-linked evaluation of students and teaching.

References

Adams, R.J., Doig, B.A. & Rosier, M. (1991). Science learning in Victorian schools: 1990. Hawthorn, Vic.: Australian Council for Educational Research.

Board of Studies, Victoria. (1994). Curriculum and standards framework: Draft for consultation. Carlton: Board of Studies.

Bourke, S.F., Mills, J.M., Stanyon, J. & Holzer, F. (1981). Performance in literacy and numeracy: 1980. Canberra: Australian Education Council.

Collins, C. (1994). Curriculum and pseudo-science: Is the Australian national curriculum project built on credible foundations? Occasional Paper No. 2. Canberra: Australian Curriculum Studies Association.

Curriculum Corporation. (1994a). English - a curriculum profile for Australian schools. Carlton: Curriculum Corporation.

Curriculum Corporation. (1994b). A statement on English for Australian schools. Carlton: Curriculum Corporation.

Curriculum Corporation. (1994c). A statement on studies of society and environment for Australian schools. Carlton: Curriculum Corporation.

Dawkins, J. (1988). Strengthening Australia's schools: A consideration of the focus and content of schooling. Canberra: Australian Government Publishing Service.

Department of Employment, Education and Training. (1990). Australian national report on the development of education. International Conference on Education, 42nd Session, Geneva, 1990. Canberra: Australian Government Publishing Service.

Evaluation and Assessment Unit. (1993). 1992 survey of basic numeracy skills of 10-year-old Tasmanian students. Hobart: Department of Education and the Arts.

Forster, M., Mendelovits, J. and Masters, G. (1994). Developmental assessment resource for teachers (DART) English. Melbourne: Australian Council for Educational Research.

Hill, P. (1994). Putting the national profiles to use. Unicorn, 20(2), 36-42.

House of Representatives Select Committee on Specific Learning Difficulties. (1976). Learning difficulties in children and adults. Canberra: Australian Government Publishing Service.

Keeves, J.P. (1992). Learning science in a changing world: Cross-national studies of science achievement 1970 to 1984. The Hague: International Association for the Evaluation of Educational Achievement (IEA).

Keeves, J.P. and Bourke, S.F. (1976). Literacy and numeracy in Australian schools: A first report. Australian Studies in School Performance, Vol. 1. Canberra: Australian Government Publishing Service.

Kramer, L., Moore, S., and Baker, K. (1992). Educating Australians. Canberra: Institute of Public Affairs.

Lokan, J. and Ford, P. (1994). Mapping state testing programs. Melbourne: National Industry Education Forum.

Masters, G., Lokan, J., Doig, B., Khoo, S.K., Lindsey, J., Robinson, L. and Zammit, S. (1990). Profiles of learning: The Basic Skills Testing Program in New South Wales 1989. Hawthorn: Australian Council for Educational Research.

Masters, G.N. (1991). Assessing achievement in Australian schools. Melbourne: National Industry Education Forum.

McGaw, B., Long, M.G., and Rosier, M.J. (1989). Literacy and numeracy in Victorian schools: 1988. ACER Research Monograph No. 34. Hawthorn: Australian Council for Educational Research.

National Industry Education Forum. (1991). Declaration of goals for Australia's schools. Melbourne: National Industry Education Forum.

Postlethwaite, T.N. and Wiley, D.E. (Eds.) (1991). The IEA science study II: Science achievement in twenty-three countries. Oxford: Pergamon.

Quality Assurance Directorate. (1994). Assessment of student performance 1993: Aspects of mathematics overall results. Brisbane: Department of Education.

Review and Evaluation Directorate. (1993). Assessment of student performance 1992: Aspects of reading and writing overall results. Brisbane: Department of Education.

Rosier, M.J. and Banks, D.K. (1990). The scientific literacy of Australian students. ACER Research Monograph No. 39. Hawthorn, Victoria: Australian Council for Educational Research.

Titmanis, P., Murphy, F., Cook, J., Brady, K. and Brown, M. (1993). Profiles of student achievement: English and mathematics in Western Australian government schools 1992. Perth: Ministry of Education.

Author details: Barry McGaw is Director of the Australian Council for Educational Research (ACER).

Please cite as: McGaw, B. (1994). Standards from a curriculum and assessment perspective. Queensland Researcher, 10(2), 1-18. http://education.curtin.edu.au/iier/qjer/qr10/mcgaw.html

