Friday, December 30, 2011

Performance Measures for Australian Universities

Senator Chris Evans, Minister for Tertiary Education, Skills, Jobs and Workplace Relations, has issued three papers for comment on performance measures for Australian universities and the "MyUniversity" website (modelled on the My School website), starting during 2012. The discussion papers are (excerpts appended):
  1. Development of Performance Measurement Instruments in Higher Education – exploring the balance of performance measurement instruments, potential uses, deployment methods and participation and selection of students;
  2. Review of the Australian Graduate Survey (AGS) – examining the strategic position of the AGS in its relationship with other survey instruments, administration methods, timeliness and capacity to measure the diversity in student experience; and
  3. Assessment of Generic Skills – focusing on the development of an instrument appropriate for Australian universities in the context of the OECD’s Assessment of Higher Education Learning Outcomes (AHELO) project.
The proposed Australian measures are about student satisfaction and academic results. They do not appear to cover the ancillary costs of studying, or job success after graduation, as the UK metrics for universities do.

There could be considerable gaming of the figures (as has happened with the "My School" website). For example, in the case of the costs of studying, universities could make blended courses look cheap by assuming that students will study by distance education. That is, a university could claim a wonderful range of on-campus facilities and accommodation, but assume the students never use them when calculating the cost of study.

Of course, that might be an accurate view of modern student life. I am a postgraduate university student at the moment. Six weeks into the course, I submitted my first mid-semester assignment on Friday. The university has excellent facilities for students, but I have never been to the campus, as it is more than a thousand kilometres away.

Development of Performance Measurement Instruments in Higher Education

Discussion Paper

December 2011

Table of contents

1. Introduction
1.1. Performance Funding
1.2. Advancing Quality in Higher Education
1.3. Purpose of this paper
2. Principles and the student life cycle framework
2.1. Principles
2.2. Student life cycle framework
3. Existing surveys
3.1. National and cross-institution surveys
3.2. Links with new instruments
4. New instruments
4.1. University Experience Survey
4.2. Assessment of Generic Skills: The Collegiate Learning Assessment
4.3. Review of the Australian Graduate Survey
5. Issues
5.1. Administration of new instruments
5.2. Student selection
5.3. Central sampling of students
5.4. Uses of data
5.5. Intersection of existing and new instruments
6. Next Steps
6.1. The AQHE Reference Group
6.2. Discussion papers
6.3. Roundtable discussions
6.4. Next steps
Appendix 1 – References
Appendix 2 – How to make a submission
Appendix 3 – Terms of Reference for the AQHE Reference Group
Appendix 4 – Membership of the AQHE Reference Group
Appendix 5 – Summary of existing surveys

Introduction

In 2008, the Government launched a major review to examine the future direction of the higher education sector, its fitness for purpose in meeting the needs of the Australian community and economy, and options for reform. The review was conducted by an independent expert panel, led by Emeritus Professor Denise Bradley AC. The panel reported its findings to the Government in the Review of Australian Higher Education (the Review) in December 2008. The Review made 46 recommendations to reshape Australia’s higher education system.

In the 2009-10 Budget, the Government responded to the recommendations of the Review with a ten-year plan to reform Australia’s higher education system, outlined in Transforming Australia’s Higher Education System. The Government’s response was based on the need to extend the reach and enhance the quality and performance of Australia’s higher education system to enable it to prosper into the future.

To extend reach, the Government holds an ambition to increase the educational attainment of the population such that by 2025, 40 percent of all 25-34 year olds will have a qualification at bachelor level or above. The Government also seeks to increase the higher education participation of those people who are currently underrepresented in higher education. In particular, by 2020, the Government expects that 20 per cent of higher education enrolments at undergraduate level will be of people from low socio-economic backgrounds.

Reforms announced to achieve these ambitions included the establishment of the Tertiary Education Quality and Standards Agency, the introduction of the demand driven funding system and significantly improved indexation on grants, Performance Funding and mission based Compacts.

The Government’s response to the Review shares a number of features with reform agendas in other areas of “human capital development” such as health, employment services, and disability services that are being implemented in Australia and internationally. These common features include: opportunities for citizens to exercise greater choice between alternative providers; the introduction of funding that “follows the consumer” and thus gives them more power in the service relationship and strengthens incentives for providers to tailor their offerings to citizens’ requirements; improved regulation to ensure minimum quality standards; and improved information on performance to allow citizens to make better informed choices.

The Government’s efforts to improve performance reporting and transparency are aimed at enhancing the quality of information available to students, to give them greater confidence that the choices they make are the right ones for them. The performance of universities has a number of domains, including but not limited to: research, teaching, financial performance, student experience, the quality of learning outcomes and access and equity. Each of these domains has a specific mechanism or tool (sometimes more than one) designed to capture relevant information about performance in that domain. For example, the Excellence in Research for Australia (ERA) process captures information about research performance; access and equity outcomes are captured through student data collections that include markers for low-SES status; and TEQSA will be implementing teaching standards by which the performance of universities will be measured.

Similarly, the three performance indicators that are the subject of this paper are designed to capture information about how universities perform in the domains of student experience and the quality of learning outcomes. There are likely to be synergies and complementarities with other tools, for example, TEQSA’s teaching standards. They therefore should be seen as part of an overarching suite of performance measures and mechanisms that are designed to capture information across the most relevant domains of university performance, necessary for improving the information available to students as they seek to exercise the choices that are now open to them in the demand-driven system. It should be noted that the newly created MyUniversity website will be used for presenting information to students about performance across the various domains.

Performance Funding

In late 2009, the Department convened an Indicator Development Group comprised of experts in the higher education sector. The group assisted in the development of a draft indicator framework, outlined in the discussion paper, An Indicator Framework for Higher Education Performance Funding, which was released for consultation in December 2009. The paper proposed 11 possible performance indicators in four performance categories. 61 submissions from the sector were received in response to the discussion paper.

The Government considered the feedback received and refined the framework to include seven indicators in three performance categories: participation and social inclusion, student experience and the quality of learning and teaching outcomes.

The Government released draft Performance Funding Guidelines for discussion in October 2010. The draft Guidelines provided details on the proposed implementation of the Performance Funding arrangements. The Government received 44 responses to the draft guidelines.

In the 2011-12 Mid Year Economic and Fiscal Outlook (MYEFO) the Government announced that it would retain Reward Funding for universities that meet participation and social inclusion targets. The Government discontinued Reward Funding for student experience and quality of learning outcomes indicators in the context of the Government’s fiscal strategy and on the basis of feedback from the sector that there was no consensus on the issue of whether it is appropriate to use such indicators for Reward Funding.

Universities have acknowledged the need to develop a suite of enhanced performance measures for providing assurance that universities are delivering quality higher education services at a time of rapid expansion. The Government will focus on the development of student experience and quality of learning outcomes indicators for use in the MyUniversity website and to inform continuous improvement by universities. The Government has agreed that three performance measurement instruments will be developed over the duration of the first Compact period: a new University Experience Survey (UES), an Australian version of the Collegiate Learning Assessment (CLA) and a Review of the Australian Graduate Survey (AGS). The Government has removed the composite Teaching Quality Indicator (TQI) from the performance indicator framework since universities have made a reasonable case that university performance is best measured using output rather than input indicators.

Final Facilitation Funding and Reward Funding Guidelines and Administrative and Technical Guidelines will be released on the Department’s website in December 2011. These will provide an outline of the final Performance Indicator Framework and Performance Funding arrangements.

Advancing Quality in Higher Education

In the 2011-12 Budget, the Government released details of its Advancing Quality in Higher Education (AQHE) initiative designed to assure and strengthen the quality of teaching and learning in higher education. The announcement provided more information on the new performance measurement instruments being developed and the available funding for developing the instruments. The consultation processes for the initiative were also outlined including the establishment of an AQHE Reference Group to advise on the development and cohesiveness of the performance measurement instruments. The AQHE Reference Group will also assist in the development of discussion papers for each of the instruments. Roundtable discussions with universities, business and students will also be held later in 2012.

Purpose of this paper

This paper discusses key issues in the design of the performance measurement instruments, assesses their fitness for purpose and their ability to operate together in a coherent way to obtain a comprehensive view of the student’s undergraduate university experience and learning outcomes. This paper does not describe in detail how the measurement instruments will be implemented - these issues will be outlined in separate discussion papers on each of the instruments.

The second section of this discussion paper considers some principles to guide the development of new performance measurement instruments. It also proposes that a framework of the student life cycle be used to situate the development of new performance measurement instruments. The third section briefly discusses existing survey instruments. The development of new performance measurement instruments for use in performance reporting and for other purposes is discussed in the fourth section. The fifth section considers key issues that impact on the coherence and balance of performance measures. The final section outlines a proposed implementation strategy for the new performance measures. ...

Issues

This section discusses key issues that arise from consideration of existing and new performance measurement instruments. These include the administration of performance measurement instruments, student selection, central sampling of students, uses of data and the intersection of existing and new instruments. Discussion of these key issues facilitates an assessment of the fitness for purpose of performance measurement instruments and their ability to operate together in a coherent way to obtain as comprehensive a view as possible of the student’s undergraduate university experience.

Administration of new instruments

Student surveys tend to be conducted in Australian universities using one of two broad deployment approaches (ACER, 2011, p.14), specifically:

  • an independent (or centralised) deployment, in which most if not all survey activities are conducted by an independent agency; or

  • a devolved deployment (or decentralised), in which institutions and a coordinating agency collaborate on survey operations.

An independent deployment approach involves participating universities providing the independent agency with a list of all students in the target sample at their institution, including students’ contact details. After receiving institutions’ population lists, the independent agency would identify the target population, which could be either a census or sample of students, and invite students to participate in the survey. Responses would then be returned directly to the independent agency for analysis.

A devolved approach involves participating universities supplying the independent agency with a de-identified student list that excludes student contact details. A sample of students would be drawn, online survey links would be allocated to student records, and this list would be sent back to universities who would then merge in student contact details. Under a devolved approach, universities manage the deployment of the survey by sending invitations to sampled students and following up with non-respondents. Responses are provided directly to the independent agency for analysis.
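To make the devolved data flow concrete, here is a minimal sketch in Python (the record layouts, link format and contact data are illustrative assumptions only; the actual systems are not specified in the paper):

    # Sketch of the devolved deployment described above.
    # Record layouts, URLs and contact data are invented for illustration.

    # Step 1: the university sends a de-identified student list to the agency.
    deidentified = [{"record_id": "S001"}, {"record_id": "S002"}]

    # Step 2: the agency draws the sample and allocates unique survey links.
    sampled = [dict(rec, survey_link="https://survey.example/r/" + rec["record_id"])
               for rec in deidentified]

    # Step 3: the university merges contact details back in and sends invitations;
    # under this approach the agency never holds the contact details.
    contacts = {"S001": "student1@uni.example", "S002": "student2@uni.example"}
    for rec in sampled:
        email = contacts[rec["record_id"]]
        print("invite", email, rec["survey_link"])   # stands in for the mail-out

    # Step 4: responses come back to the agency via the unique links, so the
    # university never sees individual answers.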

Both deployment approaches have benefits and limitations. A devolved approach has benefits in that it can accommodate the needs and circumstances of a diverse array of universities. On the other hand, the fact that universities are primarily responsible for collecting data about their own performance can be seen as a conflict of interest, leading to perceptions that universities may ‘game’ the system. Given the stakes involved and the uses to which the data collected from the new instruments will be put, on balance, an independent approach is favoured, since this will promote validity, consistency and efficiency.

A key issue for any independently deployed survey is privacy and the responsibility of universities (and the Department) to preserve the confidentiality of student information they hold. Universities may be required to amend their agreements with students to permit disclosure of personal information to third parties for the purposes of conducting surveys. Providing privacy laws are satisfied in the development of an instrument, this approach has to date received broad support from the higher education sector, as measured through consultation in the development of the UES.

While the deployment approach is a significant consideration in terms of administration of a new survey instrument, there are other issues which should be considered. These include, but are not limited to, the administration method (online, telephone, paper-based), context and timing. These issues will be considered in the context of individual instruments.

Questions for Discussion

  • What concerns arise from an independent deployment method?

  • What are the obstacles for universities in providing student details (such as email address, first name and phone numbers) to an independent third party?

  • Would universities agree to change their privacy agreements with their students to permit disclosure of personal information to third parties for the purposes of undertaking surveys?

  • What are the other important issues associated with administration of survey instruments?

Student selection

The selection of students for the new performance measurement surveys is an issue which has raised a number of concerns. The burden that the existing range of survey instruments places on both students and university resources is foremost in the minds of universities. A balance therefore needs to be struck between the need to collect new data, for performance reporting and quality assurance, and the demands placed on students and universities.

The surveys could be run as a census of all students in scope or by administering the survey to a sample of the students in scope. Deciding between a census and a sample is a complex process that necessarily takes into account many technical, practical and contextual factors.

Two major issues are non-response biases and general confidence in the precision of results, particularly at the sub-institutional level. The advantage of a sample survey approach is that a structured sample can be constructed that deliberately focuses on target groups for which it is necessary to generate meaningful results (for example at the course level or for particular demographic groups) and this may assist in overcoming non-response biases. A sample survey approach would require a relatively sophisticated sampling frame to give adequate coverage across fields of education and demographic characteristics. This process would be simplified if the Higher Education Information Management System (HEIMS) database could be used to construct the sample frame, given that it already records detailed information on student characteristics. Standard techniques to measure the precision of sample survey results could be systematically applied across all results, for example, calculating confidence intervals or standard errors.

On the other hand, given the small student populations sometimes under consideration (for example, courses where only a small number of students are enrolled at a particular institution), sample sizes needed to provide confidence in survey results would approach the total population. The intention to publish data from the new performance measurement instruments on the MyUniversity website disaggregated to subject level may influence the decision on whether to conduct a census or survey since a sufficiently large number of responses will be required to ensure data are suitably robust and reliable. In this case, it may be preferable to continue on a ‘census’ basis where the whole population is approached to participate.
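The arithmetic behind this point can be illustrated with a standard sample-size calculation using the finite population correction (a sketch only; the margin of error and population sizes below are invented for the example, not drawn from the paper):

    # Sketch: sample size for a proportion at a 95% confidence level,
    # with finite population correction. All figures are illustrative.
    import math

    def sample_size(N, moe=0.05, p=0.5, z=1.96):
        """Sample needed for margin of error moe in a population of N."""
        n0 = (z ** 2) * p * (1 - p) / moe ** 2   # infinite-population size
        return math.ceil(n0 / (1 + (n0 - 1) / N))

    print(sample_size(N=20000))   # ~377: a modest sample for a whole institution
    print(sample_size(N=100))     # ~80: close to a census for a small course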

Other issues for consideration in deciding between a census or survey approach include:

  • Support by participating institutions;

  • The size and characteristics of the population;

  • Providing students with opportunities for feedback;

  • Relationship with other data collections, in particular other student surveys;

  • Analytical and reporting goals, in particular sub-group breakdowns;

  • Anticipated response rates and data yield;

  • Consistency and transparency across institutions;

  • Cost/efficiency of data collection processes; and

  • The availability of supplementary data for weighting and verification.

The method of student selection may vary between instruments, and regardless of whether a census or sample approach is used, proper statistical procedures will be used to evaluate the quality and level of response in the long term.

Questions for Discussion

  • What are key considerations in choosing between a sample or census approach to collection of performance data?

Central sampling of students

As discussed above, an important issue regarding the introduction of new surveys within the higher education sector is the perceived burden on university resources and the students who are required to participate.

One method which has been proposed to assist in the reduction of this burden is the possibility of using DEEWR HEIMS (Higher Education Information Management System) data to better control student sampling and also to use stored student demographic level data to pre-populate survey questions where appropriate.

Using the HEIMS data in this way could potentially improve random sampling and avoid oversampling of students invited to participate in surveys, while also reducing the number of questions students are required to answer per survey through the ability to pre-fill and skip questions where data is already available. In addition, by having the Department involved at this level in the survey process, this could improve perceptions of the overall integrity of surveys through making clear that samples are independently constructed.
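A minimal sketch of the pre-fill and skip logic (the field names here are invented for illustration; they are not actual HEIMS data elements):

    # Sketch: skip survey items the central collection can already answer.
    # Field names are illustrative assumptions, not actual HEIMS elements.
    heims_record = {"age": 24, "attendance_mode": "external", "field_of_education": "IT"}

    questions = ["age", "attendance_mode", "field_of_education", "overall_satisfaction"]

    to_ask = [q for q in questions if q not in heims_record]
    print(to_ask)   # ['overall_satisfaction'] - the only item the student must answer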

Note that there are restrictions in the Higher Education Support Act regarding the use of HEIMS data. The Department, therefore, is investigating options for the use of this data in the context of individual performance measurement instruments and also as a suite of instruments.

Questions for Discussion

  • What are the advantages and disadvantages of central sampling of students?

Uses of data

The collection of data from the new performance measurement instruments will be of major benefit to universities for continuous improvement and to assure the quality of teaching and learning in the sector. The Government has also indicated that, subject to their successful trial and implementation, it is intended that data from the new instruments will be published on the MyUniversity website.

While a number of uses are proposed for the new data collections, other uses may be discovered throughout the development of the new instruments. It will be important, therefore, to consider potential future uses of the data while being aware of the potential for misuse of the data.

By utilising the principles referred to in section 2.1 of this paper in the development of the instruments, the data available should be relevant, reliable, auditable, transparent and timely. These principles will also provide guidance as to the appropriate uses of the data. Further, the range of consultations being undertaken in the development of the new instruments will include roundtables where stakeholders will be able to discuss how the results from the new instruments can be used while raising any concerns regarding the future use of the data. In addition, it may be appropriate to establish codes of practice that guide appropriate uses and interpretation of performance information.

Continuous improvement

It will be important that the data collected from the performance measurement instruments is useful for universities, so that they can implement processes and policies for continuous improvement and maintain a high level of quality of teaching and learning.

To ensure the performance measurement instruments collect data that is useful and relevant to universities, the instruments are being developed with significant sector consultation. Stakeholders will be invited to provide feedback throughout the development of the new instruments to allow these to take account of their data and measurement needs.

Using the student life cycle model, consideration needs to be given to what information can and should be collected from the new instruments given their implementation at different stages of the life cycle, and how this information will assist to assure the quality of teaching and learning in Australian universities.


MyUniversity

It is expected that in future releases, the MyUniversity website may include performance data from the new performance measurement instruments. This would give prospective students additional information with which to assess the performance and quality of different institutions in the three performance categories. How the data will be presented on the MyUniversity website will be a consideration once the instruments have been tested and the level of data analysis is known.

Information regarding the use of performance data on the MyUniversity website will be made available throughout the development process for the instruments and the website itself.

Another issue that arises in consideration of the MyUniversity website is the level of reporting. A major purpose of the website is to inform student choice about courses and subjects. In this environment, more detailed reporting is likely to be desired, for example, at the field of education level. There is likely to be a trade-off between collecting and reporting data from the new performance measurement instruments at a finer level of disaggregation, and adding to the complexity and burden of reporting.

Questions for Discussion

  • What are appropriate uses of the data collected from the new performance measurement instruments?

Intersection of existing and new instruments

The development of the University Experience Survey (UES) has raised the issue of whether the new instruments should be focused instruments for the purposes of performance reporting, or whether they could potentially be expanded to replace existing surveys and institution/course specific questionnaires. For example, what is the potential overlap between the newly developed University Experience Survey and the Course Experience Questionnaire in measuring student experience?

This will be a consideration in the development of all new instruments to ensure there is balance between the additional burden on both students and universities of the new instruments and ensuring they are able to capture targeted and purposeful information. Further, there needs to be consideration of the data needs of individual universities and how these differ across the sector.

A key issue in considering the overlap between the new instruments and existing survey instruments is the uses to which performance measurement data will be put as discussed above. It is expected students will benefit from the availability of new data via the MyUniversity website. The new performance measurement instruments will potentially be used to enhance continuous improvement processes within universities. There is also the potential for international benchmarking.

A major dilemma is the need for broad level national data and the requirement for more detailed data to suit universities’ diverse needs and missions. Satisfying both of these goals raises the important issue of costs and resource burden. One suggested solution for the UES is that there could be a core set of items which would be asked of students at all universities, and an optional set of non-core items which universities could select to suit their individual requirements.

Ultimately, universities will decide in which instruments they participate, and this will hinge on a range of factors not limited to the type of data collected. By considering, however, the range of existing surveys, what data is most useful to universities, and what additional data universities would like to collect from the new performance measurement instruments, the development of the new instruments has the potential to make this decision by universities considerably easier.

Questions for Discussion

  • Are there other issues that arise when considering the overlap of existing and new instruments?

...


From: Development of Performance Measurement Instruments in Higher Education, Department of Education, Employment and Workplace Relations, 9 December 2011

Review of the Australian Graduate Survey

Discussion Paper

December 2011

Table of Contents

1. Introduction
1.1. Policy context
1.2. Consultation
2. Principles and the student life cycle framework
2.1. Principles
2.2. Student life cycle framework
3. Strategic position, role and purpose of the AGS
3.1. Overview of the AGS
3.2. Role and purpose of the CEQ
3.3. Role and purpose of the GDS
3.4. Future strategic position of the AGS
4. Administration issues
4.1. Administrative model
4.2. Timeliness
4.3. Funding
5. Survey methodology and data quality issues
5.1. Methodology and standardisation
5.2. Data quality
6. Aspects of student experience
7. Next steps
Appendix 1 – References
Appendix 2 – How to make a submission
Appendix 3 – Current AGS survey instrument

...

Consultation

The Australian Graduate Survey (AGS) is a national survey of newly qualified higher education graduates, conducted annually by Graduate Careers Australia (GCA). A strengthened AGS is part of the suite of performance measurement instruments that were announced as part of the AQHE initiative. The Department of Education, Employment and Workplace Relations (DEEWR) is working with GCA and the higher education sector to review the AGS. The review is examining the strategic position of the survey, and aims to improve the survey content, data collection methods and timeliness of reporting. The review is also considering how to better capture aspects of student experience for external, Indigenous, international and low socio-economic status students.

Consultation for the AQHE initiative includes the establishment of an AQHE Reference Group to advise on the cohesiveness of the three instruments, and the development of an overarching discussion paper, Development of Performance Measurement Instruments in Higher Education. In addition, the AQHE Reference Group will assist in the development of discussion papers on each of the instruments. Consultations and roundtable discussions with universities, business and students will also be held later in 2011 and in 2012.

The Department has prepared this discussion paper based on consultation with and advice from the AQHE Reference Group. The paper raises issues and options for the future of the AGS, with the aim of canvassing views from universities and other stakeholders in the sector. Information on how to contribute to the process can be found below. ...

Future strategic position of the AGS

The Government announced as part of the Advancing Quality in Higher Education initiative the development of a suite of performance indicators for the higher education sector (a summary of the new indicators can be found in the Development of Performance Measurement Instruments in Higher Education discussion paper). These indicators will provide greatly enhanced information on university performance. At the same time, the sector is moving towards a student centred funding model. To assist students in making informed decisions about their tertiary education, it is intended that the MyUniversity website will from 2013 include detailed results from the new performance indicators. Higher education sector stakeholders, including institutions, students and Government, will therefore be responding dynamically to a greater range of information sources than has previously been available. Importantly in this respect, institutional level and institution by field of education level data will be made public. While this does not accord with current AGS practice, it is consistent with approaches to publishing performance information previously undertaken by the Department.1

This changed environment presents a major challenge to the ongoing relevance and strategic position of the AGS. From being the prime source of nationally benchmarked data on university performance, the AGS will become one of several available data sources. In this context, the ongoing role and value of the AGS needs to be clearly articulated. The AGS may need to be modified to enable the survey to establish a coherent place among the range of new indicators, and to ensure it continues to meet the evolving needs of higher education sector stakeholders.

Given the increasing number of surveys in which university students are being asked to participate, and for which universities are being asked to provide administrative support, the additional value offered by the AGS needs to be clearly articulated. One option to reduce cost and respondent burden would be to move from the current census basis of the AGS, where all eligible students are invited to participate, to a survey sample. This question also has ramifications for data quality, as discussed below.

Consideration should also be given as to whether the CEQ should move to surveying students, rather than graduates, in line with the other performance indicators being developed. The CEQ was originally developed and tested for use with undergraduate students in the United Kingdom. In Australia, however, it has always been administered to graduates, which may lead respondents to focus on overall course experience (as intended), rather than specific subjects or instructors. Surveying graduates has also allowed the CEQ to be administered simultaneously with the GDS. Conceptually, however, the CEQ measures satisfaction across the whole of the student lifecycle, and there is no inherent reason why this need take place after graduation.

The most notable challenge to the ongoing relevance of the CEQ comes from the new University Experience Survey (UES). The UES will gauge student attitudes towards a number of aspects of their university course, initially at the end of their first year and potentially in their final year of study. The UES will measure aspects of students’ university experience associated with high level learning outcomes such as teaching and support, student engagement and educational development. While not identical to the information garnered by the CEQ, the UES will provide an alternative measure of student satisfaction and course experience perceptions across the student lifecycle. Consideration needs to be given to the value of continuing the CEQ as an additional survey instrument.

Information provided by the GDS will not be replicated by any of the new performance indicators. By its nature, the GDS is a measure of a university’s contribution to skill formation in relation to labour market outcomes and can only be administered at the end of the student lifecycle. Information on graduate outcomes will continue to be of value to the sector. Nonetheless, consideration should be given as to whether the GDS as currently configured is appropriate for the needs of the sector in the future.


Questions for Discussion

  • Is joint administration of the GDS and CEQ under the AGS still appropriate?

  • Will the GDS and CEQ adequately meet future needs for information in the student driven environment?

  • Should the basis of the AGS be modified to improve fit with other indicators or to reduce student burden? Would a survey sample be a more appropriate option? What are the implications for the development of the UES for the CEQ?

...

Funding

The AGS is primarily funded by a direct grant from DEEWR to GCA, made under the Higher Education Support Act. In 2011, this grant was around $660,000. Funding is also sourced from Universities Australia and from subscriptions paid by individual universities. In addition, universities provide substantial in-kind funding by administering the survey instrument. In recent years, GCA has incurred financial losses in administering the AGS, and additional funding would likely need to be found if current administrative arrangements are to continue. ...

Questions for Discussion

  • Is the current partially decentralised mode of delivering the AGS still appropriate?

  • How can the timeliness of AGS reporting be improved?

  • Are current funding arrangements for the AGS appropriate? What alternative funding arrangements should be considered?

...

Data quality

Consideration should also be given to the broader data quality issues. Conceptually, the AGS currently operates on a census basis, in that all eligible graduates are invited to respond. GCA procedures mandate that institutions achieve at least a 50 per cent overall response rate. For the 2010 AGS the national response rate was 53 per cent for the CEQ and 57 per cent for the GDS.1 This is a high response rate compared with other surveys of university students, but is still low enough to raise questions as to data reliability. The two main issues are non-response biases and general confidence in the precision of results, particularly at the sub-institutional level. ...
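One standard way to address non-response bias is post-stratification weighting against known population characteristics, sketched below (the strata and counts are invented for the example; this is not a description of GCA's actual methodology):

    # Sketch: post-stratification weights to correct differential non-response.
    # Population and respondent counts are illustrative assumptions.
    population = {"internal": 8000, "external": 2000}   # all eligible graduates
    respondents = {"internal": 4800, "external": 600}   # those who answered

    # Weight each respondent so the weighted sample matches population shares.
    weights = {g: population[g] / respondents[g] for g in population}
    print(weights)   # internal ~1.67, external ~3.33: external replies count more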


Questions for Discussion

  • Will AGS data continue to be reliable enough to meet the needs of the sector in the future? How can data reliability best be improved?

  • Would moving the AGS to a centralised administrative model improve confidence in results?

  • Would moving the AGS to a sample survey basis improve data quality?

Aspects of student experience

...


Questions for Discussion

  • Does the AGS adequately measure the diversity of the graduate population and how might it be strengthened in this regard?



1 Graduate Careers Australia, 2011(a), p. 2; Graduate Careers Australia, 2011(b), p. 3.


2 Graduate Careers Australia, 2006, pp. 61-92.


1 Department of Education, Training and Youth Affairs (1998); Department of Education, Science and Training (2001).


From: Review of the Australian Graduate Survey, Department of Education, Employment and Workplace Relations, 9 December 2011

Assessment of Generic Skills

Discussion Paper

December 2011

Table of Contents

1. Introduction
1.1. Policy context
1.2. Consultation
2. Principles and the student life cycle framework
2.1. Principles
2.2. Student life cycle framework
3. Purpose
3.1. Direct Assessment of learning outcomes
3.2. Uses
4. AHELO – progress report
4.1. Outline
4.2. Generic Skills
4.3. Progress
4.4. Australia’s participation
5. Issues
5.1. Quality assurance framework
5.2. Discipline-specific assessments
5.3. Measurement
5.4. Participation
6. Next steps

...

Purpose

Direct Assessment of learning outcomes

Direct assessment of learning outcomes represents the ‘holy grail’ of educational measurement. Objective data on student outcomes provides direct evidence that higher education is meeting economic, social and community needs. Knowledge of what students have learned and achieved and that they have attained the expected outcomes of their degrees provides assurance about the quality of higher education.

External assessment of students’ cognitive learning outcomes, at least in the higher education environment, is rare and to date has been difficult to achieve. In the schools sector, the Programme for International Student Assessment (PISA) is probably the most notable international example of the measurement of reading, mathematical and scientific literacy outcomes. Alternative measures of learning outcomes in the higher education sector, such as employment and further study, are thought to be problematic because they may be confounded by factors such as the influence of field of education and differences in state/regional labour markets. In the absence of robust direct assessment of outcomes, reliance is frequently placed instead on student self-reports, for example, as measured through the Course Experience Questionnaire (CEQ) generic skills scale, which records student perceptions of their achievement of generic skills.

The Bradley Review of Higher Education argued that, “Australia must enhance its capacity to demonstrate outcomes and standards in higher education if it is to remain internationally competitive and implement a demand-driven funding model.” (DEEWR, 2008, p. 128). As part of the new quality assurance framework, Recommendation 23 of the Bradley Review of Higher Education proposed:

“That the Australian Government commission and appropriately fund work on ...

      • a set of indicators and instruments to directly assess and compare learning outcomes; and

      • a set of formal statements of academic standards by discipline along with processes for applying those standards.”

Uses

Direct assessment of learning outcomes has many uses and benefits including providing assurance about the quality of higher education, encouraging continuous improvement among universities, meeting employer needs for more skilled graduates and informing student choice.

Quality assurance

External assessment and reporting of the attainment of generic skills provides assurance about the quality of graduates from the higher education system. With significant public and student investment in higher education, the community is entitled to understand that graduates have acquired the skills expected of them when they have completed their degree. The performance indicator framework proposes that the Collegiate Learning Assessment, or some variant of the instrument, be developed for use as an indicator of the acquisition of generic skills. A key principle in the design of performance measures is that they be ‘fit for purpose’, and it is intended that an instrument assessing generic skills be designed that is capable of being used for performance reporting.

Continuous improvement

Assessment of learning outcomes offers the prospect of a virtuous circle whereby assessment and reporting inform improved teaching and learning, in turn leading to improved assessment and reporting of learning outcomes. For example, the Collegiate Learning Assessment offers assessment tools and additional resources to support improvement of curriculum and pedagogy to promote the development of generic skills. The Council for Aid to Education (CAE), in administering the Collegiate Learning Assessment, also conducts professional development activities that train staff in the process of creating better teaching tools (and classroom-level measurement tools) aimed at enhancing student acquisition of generic skills.

Arguably, the process of teaching and learning generic skills is more effective if undertaken at discipline rather than university level. The issue of the appropriateness of university and/or discipline-specific assessments will be addressed in more detail below. In part, the focus on discipline follows the recommendation of the Bradley Review of Higher Education of the need for more work to be undertaken on academic standards by discipline and processes for applying those standards.

Employer needs

To be successful in the workplace, graduates must acquire generic skills that enable them to fully utilise their discipline-specific knowledge and technical capabilities. As suggested by the Business Council of Australia:

“The challenges involved in adapting to new and changing workplaces also require effective generic skills. Generic skills including communication, teamwork, problem solving, critical thinking, technology and organisational skills have become increasingly important in all workplaces.” (BCA, 2011, p.8)

Reliable empirical studies of employer needs and satisfaction with graduates are few and far between, which is both unfortunate and somewhat surprising given the strongly vocational orientation of much of Australian higher education. An earlier study of employers found that the skills in which new graduates appear most deficient are the generic skills of problem solving, oral business communication and interpersonal skills with other staff (DETYA, 2000, p.22). The measure of skill deficiency used in that study was the gap between employer ratings of the importance of skills and their ratings of graduate abilities in those skills. A study conducted by the University of South Australia gave broadly similar findings about employer demand for graduate skills and capabilities (UniSA, 2008).
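The gap measure used in the DETYA study is simple to compute; here is a sketch (with invented ratings on a 1-5 scale, not the study's actual data):

    # Sketch: skill deficiency as the gap between employer ratings of a
    # skill's importance and of graduates' ability in it. Ratings are invented.
    ratings = {
        "problem solving":             {"importance": 4.6, "ability": 3.4},
        "oral business communication": {"importance": 4.3, "ability": 3.2},
        "written communication":       {"importance": 4.4, "ability": 3.9},
    }

    gaps = {skill: r["importance"] - r["ability"] for skill, r in ratings.items()}
    for skill, gap in sorted(gaps.items(), key=lambda kv: -kv[1]):
        print(skill, round(gap, 1))   # largest gaps mark the biggest deficiencies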

Informing student choice

Greater transparency will inform student choice in a demand-driven funding model of higher education. The Collegiate Learning Assessment is part of a suite of performance measurement instruments designed to improve transparency in university performance. Subject to the successful development and trial of the Collegiate Learning Assessment, it is intended that universities’ results on this performance indicator will be published on the MyUniversity website from 2013 onwards.

Another issue that arises in consideration of the MyUniversity website is the level of reporting. A major purpose of the website is to inform student choice about courses and subjects. In this environment, it would be appropriate to report information at the discipline/field of education level as well as the institution level. This is another factor that impacts on the development of an appropriate assessment of generic skills. The issue of the assessment of discipline-specific generic skills is discussed in more detail below.


Questions for Discussion

  • Are there other uses of the assessment of generic skills?

...

Issues

...

Questions for Discussion

  • Which criteria should guide the inclusion of discipline specific assessments in the development of a broader assessment of generic skills?

  • Are there other criteria, not listed above, which need to be considered?

...

Questions for Discussion

  • What factors should guide the design of performance measurement instruments to assess generic skills?

...

Questions for Discussion

  • Is value-add an appropriate measure of generic skills?

  • Is it necessary to adjust measures of generic skills for entry intake and how should this be done?

  • Are there other more appropriate measures of generic skills?

...

Questions for Discussion

  • What would be an appropriate measure of generic skills for reporting university performance?

...

Questions for Discussion

  • What level of student participation is desirable and for what purposes?

  • What design features or incentives would encourage student participation?

...

From: Assessment of Generic Skills, Department of Education, Employment and Workplace Relations, 9 December 2011

Thursday, December 29, 2011

Certified Electronic University Results

The Australian National University started issuing certified electronic Academic Transcripts, Australian Higher Education Graduation Statements and Testamurs in mid-2010, via the Digitary service based at Dublin City University. Instead of providing a certified paper copy, or a scan of one, students can provide a hypertext link to the document in the on-line service. Copies printed from the service carry instructions on how to verify the details.

This is a much more secure and convenient system than trying to verify a paper copy of a document, or a facsimile of it. A university can provide such a service directly, via its own web site, but using a service shared with other universities (including the University of Cambridge and the London School of Economics) increases the credibility of the documents.
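The general idea behind such verification can be sketched as follows (a generic illustration of hash-based document checking, not a description of Digitary's actual protocol):

    # Generic sketch of document verification: the issuer keeps a fingerprint
    # of each document, and a relying party checks a received file against it.
    import hashlib

    def fingerprint(pdf_bytes):
        return hashlib.sha256(pdf_bytes).hexdigest()

    # The issuing service records the fingerprint of each document it issues.
    issued = {fingerprint(b"%PDF... transcript for student 1234 ...")}

    def verify(pdf_bytes):
        """True only if the file matches a document the issuer actually issued."""
        return fingerprint(pdf_bytes) in issued

    print(verify(b"%PDF... transcript for student 1234 ..."))   # True
    print(verify(b"%PDF... tampered transcript ..."))           # False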

Link

Tuesday, December 27, 2011

F-35B STOVL Fighter for Australian Sea Control Ship

This is to suggest the Australian Government order 24 F-35B Lightning II STOVL aircraft to equip HMAS Canberra and HMAS Adelaide. The Canberra and Adelaide are currently described as Landing Helicopter Dock (LHD) ships, operating only helicopters from the flight deck. However, the Australian Government ordered the ships fitted with ski-jump ramps intended for Short Takeoff and Vertical Landing (STOVL) fixed-wing aircraft. With these aircraft, the ships would be able to carry out their primary mission of amphibious operations. Without the protection of aircraft, the ships will be limited to operating close to the Australian coast.



The F-35B is the highest technical risk, most expensive and lowest performance of the three F-35 variants. Currently the Australian Government is considering ordering the conventional takeoff F-35A, which is less risky, cheaper and higher performing. However, the F-35B offers the ability to operate from a ship's deck, while retaining stealth characteristics and a useful payload. The extra development cost of the F-35B has in effect been underwritten by the other variants, and the aircraft has performed well in recent shipboard trials.

Risks remain with the F-35 project. Australia could order 24 F-35Bs, sufficient for the two ships. If the F-35 turns out to be successful, then F-35As could be ordered later for land-based use. Otherwise, more F/A-18Fs could be ordered for land use and just the 24 F-35Bs retained for shipboard use. Link

Canberra Schools Connected Learning Community

The ACT Department of Education and Training is providing a web-based service called the "Connected Learning Community" (CLC). This appears to be based on the Atomic Learning products. Unfortunately, neither the ACT Government nor the company web sites provide details of what the CLC is, or does. In a media release last year ("New portal gives students better access to schools", 25/05/2010), Andrew Barr MLA wrote that this would give students "... the ability to listen to past lessons as podcasts, videolink with other students for language practice and check on their homework requirements". The release also said it would allow them to complete their maths homework on-line and that "Students will be able to learn anywhere, any time."

Such a facility would be very useful, but does present challenges. It will require re-training of teachers, not only in how to use such a computer system, but also in how to plan and deliver lessons on-line, which requires a different pedagogy. Delivering old-fashioned education via such a system would be a waste of the money invested (which will need to be much more than the $20M so far allocated). It would also cause frustration for students, teachers and parents.

Allowing students to learn anywhere will also require changes to the current laws governing school attendance. Most students will be able to complete most of their academic subjects without attendance. This would reduce the requirement for attendance to perhaps two days a week, for sport and group activities. Schools could still provide student supervision in "learning centres" (the modern term for a school library), overseen by teachers. But this would not need to be compulsory, as students' academic development could equally well be supervised remotely by their teachers, via the system. This would make most current school buildings obsolete, as few individual classrooms would be needed.

One issue with the CLC is why it is limited to public schools. All Australian students are entitled to an education paid for by the state. There is therefore no reason why the same on-line system should not be available, at no cost, to students at private ACT schools.

Commercial UAV Used for Anti-whaling Surveillance

The Sea Shepherd anti-whaling ship is reported to have launched a miniature robot aircraft ("Drone puts sting in whale war", Meryl Naidoo, Hobart Mercury, December 26, 2011). Speculation on the DIY Drones website suggests this is from the company AttoPilot. The largest UAV they sell is the Jackaroo, with a 1.5 m wingspan. This only has a 25 km range, but if flown over the ship at its ceiling of 3.5 km, it would provide a view out to about 200 km, far beyond the range of the ship's radar. There is no indication of what sensors are carried, but presumably there is a camera.
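The 200 km figure is consistent with the usual line-of-sight horizon approximation, d ≈ √(2Rh); a quick check (ignoring atmospheric refraction):

    # Sketch: line-of-sight horizon distance from altitude, d = sqrt(2*R*h).
    import math

    EARTH_RADIUS_KM = 6371.0

    def horizon_km(altitude_km):
        return math.sqrt(2 * EARTH_RADIUS_KM * altitude_km)

    print(round(horizon_km(3.5)))   # ~211 km: roughly the 200 km view quoted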

Monday, December 26, 2011

CSIRO Requires a Modular Data Centre

CSIRO have issued a Request for Tender for a Modular Data Centre to be installed in Melbourne by 31 May 2012. The leading contender for this would be APC with their close-coupled cooling, as used at Canberra Data Centre. But any supplier will have difficulty providing a turnkey data centre within a few months. Link
ATM ID: CSIRORFT2011-064
Agency: CSIRO
Category: 43190000 - Communications Devices and Accessories
Close Date & Time: 27-Jan-2012 2:00 pm (ACT Local time) ...

Description:

CSIRO is seeking an innovative Data Centre solution that can readily adapt to dynamic Data Centre space, power and cooling requirements over the life of the facility. The overall solution must provide a complete turnkey solution inclusive of all site infrastructures necessary to function as a standalone Data Centre and will be assessed on its ability to be delivered as a prefabricated, preassembled and pretested suite of modules. ...

From: Request for Tender for a Modular Data Centre, CSIRO, 22-Dec-2011

Friday, December 23, 2011

ICT at the Rio+20 UN Conference on Sustainable Development

The Rio+20 UN Conference on Sustainable Development will be held from 20 to 22 June 2012 in Rio de Janeiro, Brazil. Here are some of the documents from the conference web site which mention the use of ICT:
  1. Using ICTs to tackle Climate Change: International Telecommunication Union; Global e-Sustainability Initiative
  2. ICT as an Enabler for Smart Water Management: International Telecommunication Union
  3. ICTs for e-Environment: International Telecommunication Union
This is a follow-up to the 1992 United Nations Conference on Environment and Development (UNCED). That conference achieved very little, but the UN does not seem to have learned from that failure and continues to hold the same type of expensive, last-century talk-shops.

ICT features as a development enabler in several of the documents provided for the conference (particularly Robert Pollard's). It is unfortunate that the UN has not thought to use ICT to improve the process for such events. It makes very little sense to fly people from around the world to sit and listen to some rich person tell them that poor people need a voice. It is time the UN used ICT to reform its own processes, to make them more efficient and more open. A simple way to do that would be to shrink the size of the Rio+20 conference to just a small PR unit of perhaps ten people for publicity purposes, with the delegates remaining at home and communicating on-line. Anyone else who wanted to contribute could also do so on-line.

Thursday, December 22, 2011

New learning space and teaching techniques improve student results

In "Pedagogy and Space: Empirical Research on New Learning Environments" Walker, Brooks, and Baepler report that new cabaret style rooms and student centered learning are popular with students and improve their results, but take some getting used to. They found the new room design changes the instructor's behavior, even when the instructor was not trying to teach differently. This suggests that university administrations could proceed to build new style rooms and so encourage new style teaching in them. The research found that the new student centered learning also improved results, even without the new style rooms. But the university may find it easier, cheaper and quicker to change the room designs than teacher habits.

The University of Minnesota's Sciences Teaching Student Services Building is equipped with cabaret-style rooms, typically for 117 students each. These rooms have a flat floor with a teaching podium in the centre, circular tables for groups of nine students and projection screens on the four walls.

This TEAL (Technology Enabled Active Learning) style of room was used by MIT in the mid-2000s. More recently, a style of room more like a traditional classroom has become popular. This retains the student groups at desks, but places the instructor's podium and most of the screens at one end of the room. The room is wedge shaped and the desks (rectangular, oval or wedge shaped) are orientated so most students can see the main wall (with supplementary screens on other walls). This design allows for a more traditional presentation, with the students all looking in one direction at the presenter, providing more of a group focus. Also, presenters are not distracted by having some of the audience behind their backs. However, the design can be an uncomfortable compromise.

Reduce energy use and maintain quality of life

The NSW Independent Pricing and Regulatory Tribunal released a research report, Determinants of residential energy and water consumption in Sydney and surrounds, along with a fact sheet and a Consumption Comparator. This says that detached houses with one or two occupants have higher energy bills. Also, low-income households tend to be in such houses, proportionally increasing their energy bills. The report suggests that energy use can be reduced by replacing resistance electric hot water systems, disconnecting a second fridge (the beer fridge) and using clothes dryers and air conditioners less. Both water and energy use can be reduced with low-flow showerheads and tap aerators.

This is all sensible advice. However, Australia has a stock of large, inefficient detached houses designed for much larger families than currently occupy them. It would take many decades for these to be replaced with smaller, more efficient housing, even if there were policies in place to make that happen. One solution would be to divide detached houses to make two or three town houses. Another would be policies to encourage people to share larger houses.

Some policies are in conflict. As an example, resistance electric hot water systems are now banned for new homes and as replacements in older ones. But new energy efficient hot water systems have a high capital cost, placing them out of reach of low income families. A simple alternative would be a smaller resistance electric hot water system: small electric hot water systems are inexpensive and more efficient than large models, but are now banned.
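As a rough illustration of the affordability problem, here is a back-of-envelope payback calculation, taking a heat pump unit as an example of an efficient replacement. All the prices and consumption figures are assumptions for illustration, not data from the IPART report:

    # Back-of-envelope payback estimate for a heat pump hot water system
    # versus a small resistance electric unit. All figures are assumptions
    # for illustration only, not from the IPART report.

    resistance_capital = 500.0     # assumed installed cost (AUD)
    heat_pump_capital = 3000.0     # assumed installed cost (AUD)

    annual_hot_water_kwh = 2000.0  # assumed household hot water energy use
    heat_pump_cop = 3.0            # assumed coefficient of performance
    tariff = 0.25                  # assumed electricity price (AUD/kWh)

    resistance_running = annual_hot_water_kwh * tariff
    heat_pump_running = (annual_hot_water_kwh / heat_pump_cop) * tariff

    annual_saving = resistance_running - heat_pump_running
    payback_years = (heat_pump_capital - resistance_capital) / annual_saving

    print(f"Annual saving: ${annual_saving:.0f}")
    print(f"Payback period: {payback_years:.1f} years")

On these assumed figures the payback period is about seven and a half years, a long horizon for a family that cannot raise the capital in the first place.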

Also, small hot water systems can be placed in the house near the kitchen and bathroom, resulting in less heat lost in the plumbing and less water wasted while the tap is left running waiting for the hot water to arrive. Large units have to be placed outside the home, with longer pipe runs and more heat and water wasted.
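To see why the pipe run matters, here is a minimal sketch of the water and heat wasted in the "dead leg" of pipe between the hot water system and the tap. The pipe dimensions and usage pattern are assumed figures for illustration, not measurements:

    import math

    # Water and heat wasted in the pipe run ("dead leg") between a hot
    # water system and a tap. All figures are assumptions for illustration.

    pipe_length_m = 10.0       # assumed run from an external unit to the tap
    pipe_diameter_m = 0.013    # assumed 13 mm internal diameter
    draws_per_day = 10         # assumed number of hot water uses per day

    # Cold water run to waste each time, waiting for hot water to arrive
    litres_per_draw = math.pi * (pipe_diameter_m / 2) ** 2 * pipe_length_m * 1000

    # Energy used to heat that water (20 C to 60 C), lost as it cools in the pipe
    delta_t = 40.0             # temperature rise, K
    specific_heat = 4186.0     # J/(kg K) for water; 1 L of water is about 1 kg
    joules_per_draw = litres_per_draw * specific_heat * delta_t

    print(f"Water run to waste: {litres_per_draw * draws_per_day:.1f} L/day")
    print(f"Heat lost in the pipe: {joules_per_draw * draws_per_day / 3.6e6:.2f} kWh/day")

Halving the pipe run halves both figures, which is the advantage of a small unit close to the kitchen or bathroom.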

Contents

1 Introduction and executive summary
1.1 We specified 2 types of model - ‘characteristics’ models and ‘uses’ models
1.2 Overview of results for the ‘characteristics’ models
1.3 Overview of results for the ‘energy uses’ models
1.4 Applying the ‘energy uses’ models to households with the same incomes and the same numbers of occupants
1.5 Policy implications: how to reduce energy consumption while maintaining the quality of life
1.6 Overview of results for the ‘water uses’ model
1.7 The Consumption Comparator
1.8 The structure of the report

2 How we analysed the household survey data
2.1 We used regression models to analyse the household survey data
2.2 We specified 2 types of model - ‘characteristics’ models and ‘uses’ models
2.3 How we present the results
2.4 Some of the terms used in this report

3 The determinants of electricity consumption
3.1 The context: average and median electricity consumption in NSW
3.2 The relationship between household characteristics and electricity consumption - the ‘characteristics’ model
3.3 How household characteristics are associated with different uses for electricity
3.4 The relationship between electricity consumption and what it is used for - the ‘energy uses’ model
3.5 How well our ‘energy uses’ model predicts electricity consumption

4 The determinants of gas consumption
4.1 The characteristics of households that use gas
4.2 The relationship between household characteristics and gas consumption - the ‘characteristics’ model
4.3 The amount of gas used for cooking, heating and hot water
4.4 The relationship between gas consumption and what it is used for - the ‘energy uses’ model
4.5 How well our ‘energy uses’ model predicts gas consumption

5 The determinants of energy usage bills
5.1 Why we used energy bills to estimate consumption
5.2 The relationship between household characteristics and energy bills - the ‘characteristics’ model
5.3 The relationship between energy bills and what energy is used for - the ‘energy uses’ model
5.4 The impact on energy bills and electricity consumption of having a Controlled Load electricity supply
5.5 The impact on energy bills of using gas for hot water and space heating

6 The determinants of energy consumption by income and number of occupants
6.1 The relationship between energy consumption and what it is used for by income group - the ‘energy uses’ model
6.2 The relationship between energy consumption and what it is used for by household size - the ‘energy uses’ model
6.3 Policy implications: how to reduce energy consumption while maintaining the quality of life

7 The determinants of water consumption
7.1 The context: water supply conditions and average consumption in the survey areas
7.2 The relationship between household characteristics and water consumption - the ‘characteristics’ model
7.3 How household characteristics are associated with different uses for water
7.4 The relationship between water consumption and what it is used for - the ‘water uses’ model
7.5 How well our ‘water uses’ model predicts water consumption
7.6 The impact of dwelling type on water consumption

Appendices
A Information about the surveyed households
B Information about our regression analysis
C Detailed regression results for gas
D Detailed regression results for energy
E Detailed regression results by household income and number of occupants
F Detailed regression results for water
G Technical information for electricity and water
H Detailed regression results for electricity

Glossary ...

From: Research Report - Determinants of residential energy and water consumption in Sydney and surrounds, NSW Independent Pricing and Regulatory Commission, December 2011

Wednesday, December 21, 2011

ANU New Learning Space

Greetings from the new teaching and learning spaces in the Hancock Building at the Australian National University in Canberra. While new science buildings with hexagonal windows are the visible sign of change in the western science precinct at ANU, a less visible revolution in education has taken place inside the old Hancock Library. The rows of grey steel bookshelves have been replaced with the brightly colored furnishings of a modern learning centre. The third floor south-western corner of the building has a number of teaching spaces. There are informal areas with cafe and lounge style seating. There is also a kitchen and a printer room.

Smaller rooms seat eight people at a wedge shaped table with a computer screen at one end. Larger rooms seat 28 people at oval tables, each with two computers.

The same wedge shaped tables are used in the tutorial rooms and in the open plan area. Each table is made up of a smaller and a larger half, which can be separated to make two tables seating four students each.

While the Hancock Library retains its 1960s brick rectilinear exterior, the new teaching space uses the same hexagonal motif as the new science buildings. The internal windows in the partitions are hexagonal and even the whiteboards are hexagonal (traditional rectangles might have been more practical). The corridors outside the rooms have informal bench seating.

The layout makes good use of space, though the rooms are perhaps a little too isolated from the outside. The benefit is that the rooms are very well insulated acoustically, so a library atmosphere can be maintained outside while a vigorous debate happens in a group inside a room. The furniture also looks robust.

The use of standard desktop computers makes for low cost and ease of maintenance. However, the relatively large boxes take up a lot of room on the small desktops, and there is a clutter of cables behind each computer, not all of which is retained in the slot in the desktop. The placement of the computers does allow good sight lines from each seat to the whiteboard, but the fit-out would have looked neater with smaller computers. An alternative would be to place the processor boxes on a shelf under the desks, with an extension cable for USB access (access to the DVD drives is rarely required).

Some thought has also gone into energy use, with the room lighting activating automatically when I entered. The fluorescent ceiling lighting is low glare, but appears to be set to switch on and off abruptly. This can be disconcerting when you are in a room and the corridor lighting suddenly goes out. A gentle transition would be better (as would LED lighting). There is a noticeable roar of air-conditioning in the rooms, but such a space would normally have a high noise level and this white noise helps mask the talking.
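A gentler transition would be simple to implement. The sketch below simulates stepping a dimmable fitting down over a few seconds rather than cutting the light off; set_level() is a hypothetical stand-in for whatever interface the building's lighting controller actually provides:

    import time

    # Fade a dimmable light out over a few seconds instead of switching it
    # off abruptly. set_level() is a hypothetical stand-in for the real
    # lighting controller interface.

    def set_level(percent):
        print(f"light level: {percent:3d}%")  # placeholder for a real control call

    def fade_out(duration_s=3.0, steps=30):
        for i in range(steps, -1, -1):
            set_level(int(100 * i / steps))
            time.sleep(duration_s / steps)

    fade_out()

The same ramp run in reverse would avoid an abrupt switch-on when someone enters a room.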

Australian Energy White Paper

The Department of Resources, Energy and Tourism issued a "Draft Energy White Paper 2011: Strengthening the foundations for Australia’s energy future", 13 December 2011.

The report mentions that communications and computer control can be used to improve energy efficiency. However, apart from a mention of "Smart Grids", there seems to be no recognition of how large a proportion of energy use ICT accounts for, nor of how ICT could be used to reduce energy use.

MIT Free Open Learning Software

MIT has announced that it is making its MITx open learning software available free in "spring 2012". But if MIT wants to offer educational facilities on-line worldwide, it will need to adjust to a wider range of cultures. As an example, "spring" is at a different time of year in the southern hemisphere.

Tuesday, December 20, 2011

Internet Transforming Australian Economy

The report "The Connected Continent: How the internet is transforming the Australian economy" was released by Access Economics in August 2011. This was sponsored by Google Australia and, not surprisingly, says the Internet is good for the economy.

Some of the claims are very broad. As an example, the report says that through e-education the Internet can reduce administrative and infrastructure costs, as well as make long-distance education and other e-learning possible. While this is true, it is not clear that high speed broadband itself improves education, or that it does so without changes to educational practices.

The report cites "Program goes beyond open course model" (, 16 September 2009) and "The ICT Impact Report: A review of studies of ICT impact on schools in Europe" (European Commission, 2006). These reports indicate that changes to educational practice are required. While they mention "broadband" there is no indication how much is required. Also the educational practices mentioned, such as on-line collaboration, do not need high speed broadband.

It should be noted that the cost of re-equipping schools with the equipment and trained teachers for new teaching methods would dwarf the cost of the Internet access. The new teaching methods would also have effects beyond education. As an example, if students can receive education on-line, then there would be large savings in school building and support staff costs. However, parents would then need to cover the cost of the child-minding service which the school was previously providing.
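A back-of-envelope comparison illustrates the scale. All the figures below are assumptions for illustration only, not from the report:

    # Rough comparison of the cost of re-equipping a school for new
    # teaching methods against the cost of its internet access.
    # All figures are assumptions for illustration only.

    students = 1000                  # assumed school size
    device_cost = 800.0              # assumed cost per student device (AUD)
    device_life_years = 4
    teacher_training = 100_000.0     # assumed annual retraining cost per school

    annual_equipment = students * device_cost / device_life_years + teacher_training

    broadband_per_month = 2_000.0    # assumed school broadband service (AUD)
    annual_broadband = broadband_per_month * 12

    print(f"Equipment and training: ${annual_equipment:,.0f} per year")
    print(f"Broadband access: ${annual_broadband:,.0f} per year")

On these assumed figures the equipment and training cost is more than ten times the cost of the connection, which is the point being made above.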

Contents
Executive summary 1
1 Introduction 3
2 The direct economic contribution of the internet 5
2.1 Expenditure-based estimate 9
2.2 Income-based estimate 10
3 How the internet is transforming the economy 11
3.1 The impact of the internet on businesses 14
3.2 Public sector and government 20
3.3 The internet’s benefits to households 24
4 Charting the growth in internet use 29
4.1 Access 32
4.2 Use 34
4.3 Expenditure 38
5 Prospects for the internet economy 39
5.1 Household access 42
5.2 Intensity of use 44
5.3 The outlook for the internet economy 46
References 47
Appendix: Methodologies 49

Executive summary

The internet has transformed the Australian economy over the last 10 years, and is poised to play an even greater role in our daily lives and businesses as Australia positions itself to become a leading digital economy.

To help reach the goal of becoming a leading digital economy, this report aims to promote a deeper understanding of the role of the internet in the Australian economy.

The direct contribution of the internet to the Australian economy is worth approximately $50 billion or 3.6% of Australia’s Gross Domestic Product (GDP) in 2010. This contribution is of similar value to the retail sector or Australia’s iron ore exports.

There are currently some obvious and direct economic benefits of the internet, such as the 190,000 people employed in occupations that are directly related to the internet – including IT software firms, Internet Service Providers (ISPs), and companies providing e-commerce and online advertising services.

But, just as the roll out of electricity changed many aspects of people's lives and transformed the way businesses operate, the internet provides wider benefits beyond its direct economic impact.

These wider benefits – which are not fully captured in GDP calculations – include:

  • Approximately $27 billion in productivity increases to businesses and government in the form of improvements to the way they operate and deliver services. These services also flow through to consumers through lower prices and the introduction of new products
  • The equivalent of $53 billion in benefits to households in the form of added convenience (e.g. of online banking and bill paying) and access to an increased variety of goods and services and information.

The internet is a catalyst for the success of Australia’s small and medium-sized enterprises (SMEs), improving how they interact with their customers and suppliers and manage their internal operations.

A customised national survey of 150 SMEs found that:

  • There is substantial scope for SMEs to take greater advantage of the internet, with all respondents using the internet but only half having their own website
  • The benefits of SMEs getting online should flow to other Australians, as SMEs suggest they are more likely to use the internet to find additional customers and suppliers locally, rather than overseas.

Growth in internet activity is accelerating. This is driven by infrastructure investment, the uptake of new technologies (e.g. smartphones) providing access to the internet, new applications such as social media sites and an increase in business and government use of the internet. Activity has doubled over the past four years, according to indices developed in this report:

  • More Australian households and businesses are going online and they are rapidly upgrading to faster connections as they become available
  • Australians are doing more on the internet. Web searches across categories ranging from banking to retail continue to increase more than 30% year on year. SMEs are steadily getting online with basic websites, and an increasing number of people are engaging with government services online
  • An index capturing consumer spending on e-commerce and business spending on online advertising has increased by 100% over the past four years. This strong growth is set to continue as Australia catches up to more developed internet economies like the US and UK, and a larger share of commerce and advertising moves online.

The direct contribution of the internet to the Australian economy is set to increase by $20 billion over the next five years, from $50 billion to roughly $70 billion.

  • This represents a growth rate (at 7%) that is twice as fast as that forecasted for the rest of the economy and will see the internet’s contribution approach that of the healthcare sector today
  • Over the same period the growth of the internet will also result in approximately 80,000 more Australians employed in areas directly related to the internet
  • Australia’s use of the internet will expand rapidly to progressively close the gap between Australia and the world’s leading digital economies
  • These expectations reflect the rollout of the National Broadband Network connecting more Australians at higher speeds, government and business making better use of the internet, and government developing a policy framework that supports investment and innovation in the internet economy. ...

From: The Connected Continent: How the internet is transforming the Australian economy, Access Economics, August 2011
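As a quick check of the quoted figures, $50 billion compounding at 7% a year does come out at roughly $70 billion after five years:

    # Check the report's projection: $50bn growing at 7% a year for 5 years.
    value = 50.0  # $ billion
    for year in range(5):
        value *= 1.07
    print(f"After 5 years: ${value:.1f} billion")  # prints about $70.1 billion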

Monday, December 19, 2011

Human-centered Computing Talk in Canberra 5 January 2012

Professor Nicu Sebe, University of Trento (Italy) will speak on "Human-centered Computing: Challenges and Perspectives" at the NICTA Seminar Room, 7 London Circuit, Canberra, 5:00 pm, 5 January 2012.

IEEE ACT - Human-centered Computing - IEEE Computer Society seminar in Canberra

Title: Human-centered Computing: Challenges and Perspectives

Date: Thursday 5 January 2012
Time: 5:00 pm - 6:30 pm
Location: NICTA Seminar Room, Ground floor, 7 London Circuit, ACT 2601

Abstract: Human Centered Computing (HCC) is an emerging field that aims at bridging the existing gaps between the various disciplines involved with the design and implementation of computing systems that support people's activities. HCC aims to tightly integrate the human sciences (e.g. social and cognitive) and computer science (e.g. human-computer interaction (HCI), signal processing, machine learning, and computer vision) for the design of computing systems with a human focus from beginning to end.

This focus should consider the personal, social, and cultural contexts in which such systems are deployed. In this presentation, I discuss the existing challenges in HCC and describe what I consider to be the three main areas of interest: media production, analysis (especially retrieval issues), and interaction. I will present my current research and how it is reflected in the HCC paradigm. In addition, I will identify the core characteristics of HCC, describe example applications, and propose a research agenda for HCC.

Bio: Nicu Sebe is a Professor with the Faculty of Cognitive Sciences, University of Trento, Italy, where he leads research in the areas of multimedia information retrieval and human-computer interaction in computer vision applications. He has been involved in organising the major conferences and workshops addressing the computer vision and human-centered aspects of multimedia information retrieval, including as General Co-Chair of the IEEE Automatic Face and Gesture Recognition Conference (FG 2008), the ACM International Conference on Image and Video Retrieval (CIVR) 2007 and 2010, and WIAMIS 2009, and as one of the initiators and a Program Co-Chair of the Human-Centered Multimedia track of the ACM Multimedia 2007 conference.

He is the general chair of ACM Multimedia 2013 and a program chair of ACM Multimedia 2011. He has served as guest editor for several special issues of IEEE Transactions on Multimedia, IEEE Computer, Computer Vision and Image Understanding, Image and Vision Computing, Multimedia Systems, and ACM TOMCCAP. He has been a visiting professor at the Beckman Institute, University of Illinois at Urbana-Champaign, and in the Electrical Engineering Department, Darmstadt University of Technology, Germany. He is the co-chair of the IEEE Computer Society Task Force on Human-centered Computing and is an associate editor of IEEE Transactions on Multimedia, Machine Vision and Applications, Image and Vision Computing, Electronic Imaging and the Journal of Multimedia.

Welcome! Please feel free to invite your colleagues to this event.

IEEE ACT web page is http://www.ieeeact.org/

Introduction to Rubrics

The book "Introduction to Rubrics: An Assessment Tool to Save Grading Time, Convey Effective Feedback and Promote Student Learning" by Dannelle Stevens and Antonia J. Levi gives a good, short (131 pages) overview of how to make marking university assignments easier. A rubric is a table laying out what will be marked and how. This has particular value with e-learning, where assignments are submitted on-line and the marking sheet can also be in electronic format (are there rubrics built as add-on modules for Moodle?).

But as with many educational innovations, rubrics require more up-front work by the teacher (or educational designer). The rubric may save time later when marking (and from not having to justify the marking to students individually), but takes work in advance to create.

Also, the idea of reducing marking to ticking or circling items in a table may offend academics' view of themselves. They want to be seen as providing detailed scholarly advice to students, not just doing tick-and-flick multiple choice marking. But as the book points out, students have difficulty understanding detailed comments and find detailed corrections of their work insulting.