Sunday, October 31, 2010

Peter Cundall at the National Library of Australia

While waiting in the queue at the National Library of Australia, I noticed gardening celebrity Peter Cundall walking in with senior library staff. He is here for a Friends of the National Library event celebrating his life and work at 2pm:

Peter Cundall's name is synonymous with gardening in Australia, and his articles in the 'Gardening Australia' magazine, as well as his much loved book 'The Practical Australian Gardener', have inspired generations of new gardeners to get their hands dirty. With a keynote speech by Holly Kerr Forsyth, garden writer and historian, and Peter Cundall in conversation with Alex Sloan of 666 ABC Canberra, this is an event not to be missed. ...

Beauty Bar for Travellers' Socks

When travelling, for a week or a month, I take one carry-on sized wheeled backpack, so space is at a premium. In place of soap, shampoo and laundry detergent, I pack a detergent bar, to wash hands, hair and clothes. This looks like a cake of soap, so does not worry security staff as much as a bottle of liquid and bag of white powder would.

Previously I used a laundry detergent bar (Sard Wonder Soap), but I noticed this removed colour when washing old socks, so I worried what it might do to skin. Dove beauty cream bar and similar products are detergent bars, using the same sort of mild detergents used in liquid body wash and shampoo, plus some "moisturisers".

I could not find an independent review of the Dove bar, but Choice Magazine rated Dove shampoo highly. I have found the Dove bar works fine for washing hands and hair. It can also be used for hand washing clothes and will still lather in salt water.

Aldi are selling regular 100g Dove bars for 99 cents (about four times the price of ordinary soap). But I am using the "extra sensitive" perfume free version, which is $2 for 100g in packs of four at supermarkets.

ps: Supermarkets also offer "Dove For Men", which comes in a grey wrapper and presumably has a masculine perfume. ;-)

US Military Use Google Android

The U.S. Special Operations Command (USSOCOM) is asking industry about Google Android, Internet and web technology for the "Tactical Situational Awareness Application Suite" (TactSA). This is to provide a peer-to-peer wireless network, moving map display, instant messaging, chat, multicast file transfer, whiteboarding and video in a device small enough for a soldier to carry.

The system will use Internet protocols and XML. While not mentioned, HTML5 would be an obvious inclusion. Ironically, responses to USSOCOM must be provided in Adobe Acrobat or Microsoft Word format (XML and HTML not accepted).

It should be noted that US special forces have some latitude to select their own equipment, so this solicitation does not necessarily indicate that the rest of the US military will do the same. While not mentioned in the solicitation, one area where an Android based system could assist is battery use, by rationalising the different devices currently carried. This approach could be taken further by following the French Fantassin à Équipement et Liaisons Intégrés (FÉLIN) program, which treats the soldier as a sensor platform, equipping them with a data and power network for the devices they carry.

Sources Sought
Added: Oct 21, 2010 11:03 am
Tactical Situational Awareness (TactSA) Application Suite

USSOCOM is seeking sources with the demonstrated capability to engineer a Tactical Situational Awareness (TactSA) software suite that will provide reliable and standardized data between battery operated micro-computers over Mobile Ad hoc Wireless Networks (MANETs). The data will traverse peer-to-peer networks without centralized servers or syncs. It will consist of numerous video streams, file transfers, Position Location Information (PLI), and whiteboarding collaboration tools.
TactSA display Application: Currently SOF has a FalconView-based SA display tool that provides the necessary capability on Windows platforms. Due to the shift in commercial hardware to mobile, battery powered systems, it is necessary to extend the SA capabilities to lighter devices that utilize the Android operating system. With the standardized data structure design described below, the user requires the ability to transition from a windows based system running FalconView to an Android based system running the TactSA display. The computing device will be connected to a GFE MANET dismount radio via USB or Ethernet. The TactSA display shall be the core moving map display for the display of all information available on the network.
Sub Applications: Additional applications and features are necessary to launch from the TactSA display that also utilize the tools, structures, protocols and mechanisms identified below for peer-to-peer networks. In addition to the dynamic SA mapping application the vendor must provide a chat application (Instant Message and chat room), reliable multicast file transfer, multi-touch whiteboarding aka John Madden tool, and ability to display multiple H.264/MPEG 4 video streams individually and simultaneously. These applications shall also recover from network outages and substantial packet loss.
Application reliability: At present many applications are developed for wired networks that do not handle network outages and retransmissions well. They rely on TCP, which has been proven not to be well suited for wireless networks. The government requires that fault tolerance be built into the TactSA, Chat, PLI and file transfer applications. The overall solution should be light enough for ultra mobile personal computers and tablet systems that cannot buffer large amounts of data. It shall provide an automatic data structure repair.
Reliable Multicast: The tactical MANET shall consist of at least two MANET networks connected via a commercial router. At any point each MANET must function independently to transfer the data seamlessly to the end client devices. As data flows within one over to the other network, packet-loss due to RF effects will occur. It is necessary to provide a Durability and Retransmission mechanism for application reliability. Such a protocol shall provide end-to-end reliable transport of data streams over generic IP multicast routing and forwarding services as described in RFC 5740. Due to the real-time nature of the data, multicast dissemination is critical to the effective use of the networks.
Application security: Although the radio systems provide encryption capability over the air, it is necessary to provide a second layer of security at the application layer. Third party VPN/IPsec technologies do not natively support multicast and do not provide peer-to-peer topologies. The encryption scheme shall utilize Suite B algorithms for protecting the data prior to traversing outside of the computational device. The approach shall have to be explained and designed for Federal Information Processing Standards (FIPS) accreditation for FIPS PUB 197.
Data structure: Due to numerous Graphic Information System (GIS) display applications it is necessary to utilize RFC validated protocols and extensible markup languages such as XML and derivatives to standardize the data. The objective capability is a seamless data architecture that delivers the network traffic to numerous GIS display technologies such as FalconView and Google Earth. The structure will be defined in a document as well as implemented in the TactSA suite.
Demonstrated capability to perform the work described (such as tangible examples, experience or past performance) are more desirable than novel approaches, although both will be given due consideration. No format for the response is specified and no questions will be answered. Please limit the technical approach to ten (10) pages and the entire submission (including supporting information) shall not exceed thirty (30) pages. Only electronic submissions in Adobe Acrobat 8.0 (or higher) or Microsoft Word (1997-2007) will be accepted (no hard copies). Potential sources must possess an active Facility Clearance (FCL) or be eligible to obtain an FCL at the SECRET level. ...

From: "Tactical Situational Awareness Application Suite" (TactSA), U.S. Special Operations Command (USSOCOM), 21 October 2010
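The solicitation's "Reliable Multicast" requirement points to RFC 5740 (the NACK-Oriented Reliable Multicast, or NORM, protocol). A toy, in-memory sketch of the core idea, not real NORM and with no networking: receivers detect gaps in the sequence numbers they received and send negative acknowledgements, and the sender retransmits only the missing packets.

```python
# Toy sketch of NACK-based reliable multicast (the idea behind RFC 5740's
# NORM protocol). This is an in-memory simulation, not a real network
# implementation: names and structure here are illustrative only.

def deliver(packets, loss):
    """Simulate multicast delivery, dropping sequence numbers in `loss`."""
    return {seq: data for seq, data in packets.items() if seq not in loss}

def nack(received, expected_seqs):
    """Receiver side: report the sequence numbers that never arrived."""
    return sorted(set(expected_seqs) - set(received))

def repair(sent, received, missing):
    """Sender side: retransmit only the NACKed packets."""
    for seq in missing:
        received[seq] = sent[seq]
    return received

sent = {i: f"chunk-{i}" for i in range(5)}
got = deliver(sent, loss={1, 3})   # packets 1 and 3 lost to RF effects
missing = nack(got, range(5))      # receiver NACKs the gaps: [1, 3]
got = repair(sent, got, missing)   # sender resends just those two
```

The point of NACK orientation is that receivers stay silent while all is well, so the scheme scales to many receivers on a constrained MANET, unlike TCP-style per-receiver acknowledgements.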

Saturday, October 30, 2010

Nook Color eBook

From the description at Barnes and Noble, the Nook Color eBook Reader appears to be a tablet computer much like the Telstra T-Touch Tab, for about the same price. Expect to see many more of these 7 inch Google Android tablet computers before Christmas, for under $300.

Friday, October 29, 2010

iPad in medical training

The EDUCAUSE Instructional Technologies Constituent Group has been discussing use of the Apple iPad in education. What struck me was the number of universities introducing the iPad for training doctors, pharmacists, dentists and nurses.

Apple, on their own web site, features the iPad for post-critical care: that is, educating the patient about what to do to recover. This is an interesting choice of application, as it avoids most of the issues of reliability, performance and security which apply to medical records applications. The iPad would be used essentially as a fancy flip chart to show the patient diagrams. So it does not much matter if it breaks, and it would not need to hold sensitive medical records.

If the iPad works okay in training, as I expect it will, this will create a demand for tablet computers throughout medicine. There have been attempts to use tablet computers in hospitals previously, but these have tended to be large, heavy devices with cumbersome software. The iPad might be what makes e-health popular and practical.

The ACT Health Library, provided by the Australian National University, has a list of iPhone/iPad/Mobile medical reference works available. This includes several reference works which require medical staff to register with their corporate identification to gain access. Others are apps for a moderate charge. There are also dozens of free items for medical students and patient education.

Thursday, October 28, 2010

Graphical Navigation of National Archives

Greetings from the Australian National University where software engineering student projects are being shown. One of these is the "Graphical navigator of CRS", by Ben Archer, Kin Ho Chan, Donghang Chen and Steven Hutchinson. This provides a graphical way to navigate the archives' records, starting from the Prime Minister down.
The National Archives holds records that have arisen from successive federal governments since 1901. Over that period, at the time each government has changed, it has been the practice for the administrative structure of the government to be altered and this has often also occurred within the life of a government. As records have commonly been transferred to the Archives after a significant lapse of time, it is usual for the records to relate to a structure of government that is no longer current at the time of transfer. In order to maintain the necessary provenance ...

From: Graphical navigator of CRS, by Stephen Ellis, National Archives of Australia, 2010

National Electricity Market Model

Greetings from the Australian National University where software engineering student projects are being shown. One of these is the "National Electricity Market Model" (NEMMOD), by Zakaria Bouguettaya, Andrew Fung, Andrew Jackson and Tatiana Vassilieva. The idea is to model electricity use in the Australian grid to better predict electricity demand, so as to optimise investment and minimise the effects of climate change.
Australia’s National Electricity Market (NEM) physically links more than 100 power stations in five states: Queensland, New South Wales, Victoria, South Australia and Tasmania. It is a dynamic, complex entity that endeavours to meet a fluctuating demand for electricity by drawing on those generators that can provide the least-cost power at each moment in time.

Coal-fired, water-cooled power plants dominate the NEM. The operation of these plants depends directly on the availability of water and on ambient temperature, and so the NEM is sensitive to climate change. Similarly, the demand for electricity also depends directly on rainfall and temperature, and is likewise sensitive to climate change. The overall reliability of the NEM (its ability to meet demand) therefore depends on the ability of the individual power stations to function economically under conditions that are expected to become steadily drier and hotter.

There is, however, considerable uncertainty concerning the way that Australia’s climate might change. While significant increases in average temperatures are likely, change in rainfall patterns is less certain—but an increase in rainfall variability is likely. The NEM covers Eastern Australia, from the tip of the Cape York Peninsula to the southern-most tip of Tasmania. This means that the power stations connected together in the NEM will operate in very different climate regimes. ...

From "Specification for a National Electricity Market Model", Barry Newell
Fenner School of Environment and Society, ANU, 2010

CEA Radar Target Simulator

Greetings from the Australian National University where software engineering student projects are being shown. One of these is the CEA Radar Target Simulator, by Kevin O'Shea, Peter Adams, Edward Hood, Dang Nguyen and Sean Wellsmor. CEA is a Canberra based defence technology company which develops radar arrays for warships. These antennas are being used to refit the RAN ANZAC frigates and will be installed on the Air Warfare Destroyers (working with the US AEGIS system), for tracking aircraft, cruise missiles and ballistic missiles.
The application shall:
• Be written using C++ and Qt
• Operate on a desktop PC running MS-Windows (XP/Vista/7)
• Model 3D (ie bearing, range, elevation) aircraft, missile and surface targets
• Be able to model at least 300 simultaneous targets
• Model own ship movement
• Output target positional information relative to own ship
• Provide basic nautical chart facilities showing land and water, latitude/longitude
• Display ownship position on chart
• Display targets on chart display
• Provide for pan and zoom on chart display
• Allow for operator selection of targets
• Allow for viewing target properties and information
• Generate target trajectories based on a script file
• Allow modification of targets and trajectories during runtime
• Allow for deletion and addition of targets during runtime

There are also a number of features we want to be able to add on later that could be scope for further development if the initial requirements are not sufficient:
• Environment data (eg weather, temperature, wave height etc)
• Electronic Counter Measures (electronic jamming)
• Identification Friend or Foe codes
• Modelling of storm movements
• Chaff (physical jamming) ...

From "Chart based radar target simulator", CEA Technologies, 2010
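The specification asks for target positional information output relative to own ship, in bearing, range and elevation. The spec calls for C++ and Qt; purely as an illustration of the geometry, a sketch in Python, assuming positions have already been converted to a local Cartesian (east, north, up) frame in metres:

```python
import math

def target_relative(own, tgt):
    """Return (bearing_deg, range_m, elevation_deg) of a target relative
    to own ship. Positions are local Cartesian (east, north, up) metres;
    a real simulator would first convert from latitude/longitude."""
    de, dn, du = (t - o for t, o in zip(tgt, own))
    horiz = math.hypot(de, dn)                        # horizontal distance
    bearing = math.degrees(math.atan2(de, dn)) % 360  # 0 degrees = north
    rng = math.sqrt(de**2 + dn**2 + du**2)            # slant range
    elevation = math.degrees(math.atan2(du, horiz))
    return bearing, rng, elevation

# A target 1000 m due east of own ship and 1000 m up should appear at
# bearing 090, elevation 45 degrees.
b, r, e = target_relative((0, 0, 0), (1000, 0, 1000))
```

Note the swapped `atan2(east, north)` argument order: compass bearing is measured clockwise from north, unlike the mathematical convention.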

US Views on Battery Backup for Telecommunications

Stephen Conroy, Minister for Communications, has stated that the Government has instructed the national broadband network that battery back-up will be mandatory. The units installed in Tasmanian homes so far have provision for a battery providing four hours of operation. It is not clear if, or when, the batteries will be installed.

The California Public Utilities Commission looked at the issue in detail in Decision 08-09-014 of September 4, 2008. The Commission considered three options:
  1. No backup requirement.
  2. Four hours.
  3. Eight hours.
It should be noted that this was four or eight hours of standby time, not talk time (that translates into about 30 minutes to one hour of talk time for a phone call). It was not clear to me if the commission made a decision on this, or what the decision was.
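The conversion above can be made explicit. The Commission's figures imply roughly an 8:1 ratio between standby and talk time (four hours of standby corresponds to about 30 minutes of talk); a quick back-of-envelope calculation, with the ratio a stated assumption rather than an engineering constant:

```python
# Rough conversion from standby hours to talk-time minutes. The 8:1
# ratio is an assumption read off the Commission's figures
# (4 h standby ~ 30 min talk; 8 h standby ~ 60 min talk).
STANDBY_TO_TALK_RATIO = 8

def talk_minutes(standby_hours, ratio=STANDBY_TO_TALK_RATIO):
    """Estimate minutes of talk time from hours of standby capacity."""
    return standby_hours * 60 / ratio

four_hour_option = talk_minutes(4)    # about 30 minutes of calls
eight_hour_option = talk_minutes(8)   # about 60 minutes of calls
```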

Following Hurricane Katrina, the US Federal Communications Commission determined (FCC 07-177, October 4, 2007), that telephone central offices must have a minimum 24 hours backup power and cell towers for mobile phones eight hours. However, it is not clear if this decision survived appeal.

Wednesday, October 27, 2010

Ubiquitous Broadband Services Reducing Carbon Emissions

In "Environmental Load Reduction Effects of Ubiquitous Broadband Services" (2007) NTT researchers compared the CO2 emissions from purchasing music in a physical store with downloading it using a fibre to the home broadband network and a wireless network. They concluded that the most energy efficient is the wireless network (31 kg CO2e/year), followed by fibre to the home (108 kg CO2e/year) and lastly conventional physical delivery (201 kg CO2e/year). I get my Green IT students to examine this analysis.
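One exercise I set is simply to tabulate and compare the study's figures. A minimal sketch, using the annual per-user numbers quoted above:

```python
# Annual CO2 figures (kg CO2e/year per user) for music delivery, as
# reported in the NTT study cited above.
emissions = {
    "wireless network": 31,
    "fibre to the home": 108,
    "physical delivery": 201,
}

# Rank delivery methods from lowest to highest emissions.
ranked = sorted(emissions, key=emissions.get)

# On these figures, fibre emits about 3.5 times more than wireless.
fibre_ratio = emissions["fibre to the home"] / emissions["wireless network"]
```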

This is relevant to the discussion of the NBN in Australia, which is a fibre to the home system. A wider analysis might show better savings for the fibre system, as there are energy savings tasks which could be performed with it, such as high resolution video replacing travel, which are not possible using lower speed wireless.

Techniques could also be used to reduce the power consumption of the NBN. The MaxLinear MxL261 digital cable front end chip, for example, has a low power mode which can be used when less bandwidth is required and to save battery power.

Tuesday, October 26, 2010

Information Professional Skills Framework

The UK Government National Archives issued a Government Knowledge and Information Management Professional Skills Framework in April 2009. This is intended for recruiting and developing knowledge and information management professionals in the UK government. It has been suggested as something I could use for defining skills for an Electronic Data Management Course.

The framework defines four levels of roles, from lowest to highest: Practitioner, Manager, Leader and Strategist.
SFIA has seven levels, from 1 (lowest) to 7 (highest): 1. Follow, 2. Assist, 3. Apply, 4. Enable, 5. Ensure/advise, 6. Initiate/influence, 7. Set strategy/inspire/mobilise.

It would seem to be reasonable to match "Strategist" to SFIA 7, as the word "strategy" is in the SFIA description.

There are then six SFIA levels to match with three GKIMPS Roles. The simplest solution is to match each GKIMPS role with two SFIA Levels:

Proposed Alignment of SFIA Levels with GKIMPS Roles
SFIA Level                          GKIMPS Role
1. Follow                           Practitioner
2. Assist                           Practitioner
3. Apply                            Manager
4. Enable                           Manager
5. Ensure/advise                    Leader
6. Initiate/influence               Leader
7. Set strategy/inspire/mobilise    Strategist

These may seem obvious, but on my first attempt I had Practitioner matched to SFIA Levels 1 to 3, Manager to Levels 4 and 5, and Leader to just SFIA Level 6. That did not give SFIA Level 5 (which is the level of the course I am designing) enough responsibility. The new arrangement is not perfect either, as it suggests that managers should "apply", when that is really a job for practitioners.
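The proposed alignment, two SFIA levels per GKIMPS role except Strategist, can be written as a simple lookup table:

```python
# Proposed alignment of SFIA levels (1 lowest to 7 highest) with the
# four GKIMPS roles, as worked out above.
SFIA_TO_GKIMPS = {
    1: "Practitioner",  # Follow
    2: "Practitioner",  # Assist
    3: "Manager",       # Apply
    4: "Manager",       # Enable
    5: "Leader",        # Ensure/advise
    6: "Leader",        # Initiate/influence
    7: "Strategist",    # Set strategy/inspire/mobilise
}

# SFIA Level 5, the level of the course being designed, maps to Leader.
course_role = SFIA_TO_GKIMPS[5]
```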

A diagram is provided showing the structure of the skills framework.

A one page Overview (PDF 17.6 KB) is provided, which expands on the structure giving more detail.

There are four categories of skills in the framework, each applying to all roles:
  1. Strategic planning for knowledge and information management
  2. Using and exploiting knowledge and information
  3. Managing and organising information
  4. Information governance
Previously I identified the four ASA/RMAA Knowledge Domains in their "Course Accreditation Check-list":
  1. Purposes and characteristics of records and recordkeeping system
  2. Context
  3. Recordkeeping processes and practices
  4. Underpinning Knowledge Domain: Recordkeeping theories and principles
Also I identified level 5 SFIA Skills relevant to records management:
  1. Information management
  2. Information policy formation
  3. Information content publishing
  4. Methods and tools
  5. Business analysis
  6. Data analysis
  7. Database/repository design
  8. Usability evaluation
This is not counting Procurement and the Quality management skills I was not sure about including. I am not sure how to reconcile these three views.

The Government KIM Professional Skills Framework – Full Version is 31 pages (PDF, 321 KB).

Using the Government Knowledge and Information Management Professional Skills Framework ... i
Acknowledgements ... iii
  1. Strategic planning for knowledge and information management ... 1
    1. Organisational planning for knowledge and information management ... 1
    2. Demonstrating the value of knowledge and information management ... 5
    3. Strategic development of knowledge and information management capability ... 6
    4. Selection and procurement of knowledge and information management resources ... 8
  2. Using and exploiting knowledge and information ... 9
    1. Knowledge sharing and collaboration ... 9
    2. Information re-use and information sharing ... 12
    3. Information analysis ... 14
    4. Integrating knowledge and information management capabilities into the business process ... 16
  3. Managing and organising information ... 17
    1. Information architecture and information control ... 17
    2. Creation and maintenance of information and records ... 19
  4. Information governance ... 22
    1. Information risk management ... 22
    2. Compliance with information legislation, regulation and corporate standards ... 24
    3. Ethics ... 26
There is also a set of Frequently Asked Questions for the Knowledge and Information Management Function (121 KB).

Monday, October 25, 2010

Australian Climate Advocacy Fund

James Thier will be talking about the Climate Advocacy Fund, 6pm, 28 October 2010 in Canberra. Unlike other ethical investment funds, this one aims to change the behaviour of the organisations it invests in, to make them greener. It is available as a superannuation investment from Australian Ethical Investment. I am one of the "nominees" for the scheme: the fund issues me with a bundle of shares so I can vote at shareholders' meetings (I don't get the money from the shares; this is just a way to have a voice at the meetings). Considerable research and thought has gone into the way the fund works. James Thier undertook international research into such funds as a Churchill scholar.
“... Launched in July 2010, the Climate Advocacy Fund is a groundbreaking way to influence corporate behaviour. The fund will pursue improved climate change performance from Australia’s largest companies, principally through resolutions at annual general meetings. Managed by Australian Ethical Investment, the Climate Advocacy Fund is the first fund of its kind in the world. To find out more please come along to our seminar. Canberra Pilgrim House Conference Centre 69 Northbourne Avenue (Gifford Room) Thursday 28 October 2010 - 6-7pm The seminars are free to attend. Light refreshments provided. Please bring along friends and family. The speaker will be James Thier, one of the founding directors of Australian Ethical. James is also a Churchill Fellow, having travelled to the US and Europe to study shareholder advocacy. James will also outline how Australian Ethical’s other funds seek out and support leading sustainable companies, and have been strong performers during recent share market volatility. RSVP to Sally Rowland by email to or by phoning 02 6201 1902. We look forward to seeing you there. ..."

Battery back-up mandatory for NBN?

Are backup batteries being provided to all NBN customers, as the Minister for Communications claims, or are they a customer installed option, as the NBN documentation states?

On ABC Radio in 2009 I stated that:
"One thing the NBN needs is a battery backup lasting at least 4 hours, so the system keeps running in an emergency. Failing to design the system for this would be unethical for ICT professionals involved in the project. The responsible decision makers involved, from the minister down would have to answer to a court if deaths result during a disaster."
Stephen Conroy, Minister for Communications appears to agree with this point of view. On the ABC Insiders program yesterday, he said:
"... And to give you another example, a story was written on Saturday I think, yesterday which suggested that we were going to be not having battery back-up.

The journalist was told the Government has instructed the national broadband network that battery back-up will be mandatory. Yet the story still appeared without any reference to that. ..."
However, the NBN Tasmania web site says that backup batteries are an optional extra:
"The NTU is supplied with a 240-volt regulated power supply. You can purchase and install a back-up battery to minimise the risk of interruption to the telephone service when there is a power failure. Once the battery is installed and charged, the NTU will remain operational for up to 4 hours in the event of a power outage. Unless a battery is installed and maintained by you or your retail service provider you will not be able to make or receive any phone calls, including calls to emergency 000 services, during a power failure."
The battery brochure from the Tasmanian NBN gives details for the customer to install the optional backup battery.

So who is correct: the Minister, or the NBN documentation?

Sunday, October 24, 2010

Designing for Canberra

Award-winning architect Aldo Giurgola will reflect on his design for the Australian Parliament House and its integration into Walter Burley Griffin's original 1912 concept for the national capital, 2pm, 14 November 2010, at the National Archives of Australia in Canberra:

Speakers Corner

Designing for Canberra

Join a fascinating discussion with the award-winning architect of Australia’s Parliament House, Aldo Giurgola and colleagues Hal Guida and Pamille Berg. They will reflect on the challenges and opportunities of designing a building of such significance that also needed to integrate with Walter Burley Griffin’s original concept from 1912 for the national capital.

A highlight will be the opportunity to view original drawings by both Aldo Giurgola and Marion Mahony Griffin.

Event information

Date & Time: Sunday 14 November, 2.00 to 3.30pm

Location: Menzies Room, National Archives of Australia, Queen Victoria Terrace, Parkes ACT

Audience: Public

Cost: Free

Bookings essential: Phone (02) 6212 3956 or email

Livable city has one quarter million people

The ADC Cities Report: Enhancing Liveability (Report part 1 and Report part 2) was released 22 October 2010 by the Australian Davos Connection. Anthony Albanese, Minister for Regional Development and Local Government, made a speech at the launch.

The report suggests that cities of 250,000 to 300,000 people will have the benefits of scale and density. These cities can be relatively self contained parts of a larger whole:
"...entirely new cities can get built with 250,000 - 300,000 people as a meaningful goal, or parts of existing cities can be re-imagined around this sizing. Large conurbations of 250,000 - 300,000 person nodes can provide for a sense of spatial identity and boundary, while maintaining significant public open space such as forests between nodes, and allowing each node to contain many aspects of urbanity, such as theatres, sports teams and large parks and gardens within them."
Last week for the Canberra 2030 Planning Workshop, I suggested Canberra could triple its density and population to around 1 million people. Canberra has five town centres: Civic, Woden, Belconnen, Tuggeranong, Gungahlin, with public open space, including forests, between these nodes. So each of Canberra's town centres could be expanded to a population of 250,000, while retaining the green space.

Expansion of Canberra's population would allow rebalancing of the housing stock.
The ABS expects the average Australian home to have only 2.2 to 2.3 people in it by 2021. Like much of Australia, Canberra has an excess of three bedroom, and larger, detached houses in low density suburbs. The city should therefore be expanded by building the needed studio apartments, plus one and two bedroom homes, at high density in the existing town centres. The new homes should have an average of only one bedroom each: that is, as many studio apartments with no separate bedroom should be built as two bedroom homes.
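A quick check of that mix arithmetic, with purely illustrative dwelling counts:

```python
# Building equal numbers of studio (0 bedroom) and two-bedroom homes,
# plus any number of one-bedroom homes, averages one bedroom per new
# dwelling. The quantities below are illustrative, not a proposal.
mix = {0: 1000, 1: 2000, 2: 1000}  # bedrooms -> number of dwellings

total_dwellings = sum(mix.values())
avg_bedrooms = sum(beds * n for beds, n in mix.items()) / total_dwellings
# avg_bedrooms comes out at exactly 1.0 whenever the studio and
# two-bedroom counts are equal.
```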

The new homes would require only a few square kilometres of land and could be built to a high environmental standard, lowering the city's ecological footprint. Each home could have solar water and space heating, low water use and low energy fittings.

Providing high density accommodation around the town centres would allow for efficient public transport. The new housing would be within walking distance of employment and high speed rapid transport between the town centres. The accommodation could be built quickly and efficiently using modular construction techniques.

Canberra's existing suburbs could be left largely unchanged, to provide a low density housing option for one third of Canberra's population. There is no financially or environmentally sustainable way to provide public transport to these suburbs, so those choosing to live in them would have to use private cars, with increasing fuel costs. But at least this would be a choice residents could make, knowing there is the option of more sustainable, high density inner city living. At present there is no option but a high cost, high environmental impact suburban house for most Canberra residents.

ADC Cities Report: Enhancing Liveability Report

Edited by Anton Roux, ADC Forum and Professor John Stanley, The University of Sydney

Report part 1: 83 pages, (4.9Mbytes)
Chairman’s Foreword 01
About ADC Forum 04
ADC Cities Summit – Program Overview 07
Integrating the City
- Why cities? 13
- COAG action 19
- The ‘must haves’ 24
- Future Population, better cities 33
- Governance arrangements 42
- Urban design and retrofitting 44
- Concluding remarks 48
The Inclusive City
- Introduction 51
- Principles 51
- Vision 52
- Themes 52
- Concluding remarks 61
The Ecological City
- Introduction 63
- What is an ecologically resilient city? 67
- Key approaches for achieving an ecologically resilient city 67
- Setting targets 67
- The top ideas 72
The Accessible City
- Introduction 75
- Principles 75
- Enables and inhibitors 75
- Initiatives and recommendations 76
- Vision and planning 76
- Community consultation and engagement 77
- Funding 77
- Technology 78
- Research 78
- Governance 79
- Reducing demand for travel 79
- Infrastructure pricing 79
- Skills 80
- Concluding remarks
Report part 2 : 57 pages (3.9 Mbytes PDF).

Ecological footprint of Canberra

The 2010 Walter Burley Griffin Memorial Lecture will be presented by Professor Brenda Vale and Doctor Robert Vale, 6pm, 4 November 2010, at the James O Fairfax Theatre, National Gallery of Australia, Canberra. Seats are limited, so RSVP to the Australian Institute of Architects ACT Chapter:

The Vales are sustainable housing pioneers, with their 1975 book "The Autonomous House" (new edition in 2002: "The New Autonomous House: Design and Planning for Sustainability"). The Vales developed the NABERS building rating system for the Australian Government. Work is under-way to adopt NABERS for measuring the energy efficiency of computer data centres.

In their talk they will look at how Canberra would be if its citizens were to adopt a sustainable ecological footprint.

Last week the ACT Government held a Canberra 2030 Planning Workshop. My suggestion was to triple Canberra's population (and density) to 1 million people. With the use of sustainable technologies, the population increase could be implemented while reducing stresses on the environment and using no more land.

Saturday, October 23, 2010

XML Business Process Standards

Professor Viswanath Venkatesh, University of Arkansas, will speak on "Are You Ready? Deployment and Success of Business Process Standards", 4pm, 12 November 2010 at the Australian National University in Canberra.

Professor Venkatesh will detail results of a study of organizations using RosettaNet, a non-profit consortium developing XML business-to-business (B2B) e-commerce interfaces. RosettaNet was set up by the Uniform Code Council, Inc. (UCC). The Australian affiliate is RosettaNet Australia and the European equivalent is EDIFICE (which originated working on EDIFACT messages).
NCISR Industry Seminar Series

Are You Ready? Deployment and Success of Business Process Standards

Process standards, which are being increasingly deployed, can provide various benefits to organizations. Not all organizations, however, are successful in their deployment. Given that process standards are growing in importance, the key question is: Is your firm ready? Organizations that aren’t ready will either fail to implement the new processes or fail to garner the benefits following implementation.

Based on an extensive study of over 70 organizations seeking to implement RosettaNet, this talk will make a case for readiness as a key prerequisite for success. Specifically, technology readiness, process readiness and people readiness are presented as three key necessary conditions for success. Details related to what constitutes readiness, how to evaluate readiness and how to get ready will be discussed.

Speaker: Professor Viswanath Venkatesh, Professor and George & Boyce Billingsley Chair in Information Systems, Walton College of Business, University of Arkansas; Visiting Professor, Australian National University

Date: 4 – 5.15 pm, Friday, 12 Nov, 2010 (followed by refreshments)
Location: Lecture Theatre 1, CBE, Building 26C, ANU (G3 on the Map)
Parking available off Childers Street (F2 on the Map).

Cost: No charge (but bookings essential, places are limited)
RSVP: Antoinette Bosman, by Tuesday, 9 November 2010
(02) 6125 9827

Professor Viswanath Venkatesh

Viswanath Venkatesh, a visiting professor at ANU, is a professor and the holder of the George and Boyce Billingsley Chair in Information Systems at the Walton College of Business, University of Arkansas. His research focuses on understanding the diffusion of technologies in organizations and society. For over a decade, he has worked with several companies and government agencies in different capacities ranging from a systems engineer to a special consultant to the Vice-President, and has rigorously studied real world phenomena. Most recently, he served on an expert panel at the United Nations that focuses on the advancement of women.

The sponsorship of his research has been about $10M, including funding from government agencies, e.g., the National Science Foundation and Department of Transportation. His work has appeared in leading academic and practitioner journals. His articles have been cited over 12,000 times per Google Scholar and 4,300 times per Web of Science. One of his papers has been identified by Science Watch (a Thomson Reuters service) as the most influential article in one of the four Research Front Maps identified in business and economics.

Sponsored by

Friday, October 22, 2010

Fixing a washing machine online

After my success using the web to help fix a toilet flush, I am attempting to diagnose a problem with the washing machine. A search on "Fisher Paykel Intuitive Eco Washer not draining" found useful items. That gave me the confidence to tilt up the washing machine and look underneath. But the problem was not quite the same as described, so I looked for "pump hot" and found the Fisher Paykel Washer Diagnostic Fault Codes. This said:
The pump is fitted with a thermal cut out device. Check if this device has been activated. If it has wait until the pump cools down before restarting. Check for any pump blockage and condition of pump before attempting to restart. i.e. pump seizure ...

US Department of Commerce on Green IT

The Office of Technology and Electronic Commerce, International Trade Administration, of the United States Department of Commerce has released a presentation on Green IT by Tim Miles (19 May 2010).
The Green IT presentation focuses on the impact that IT has on energy consumption and the role of Green IT in energy-efficiency and carbon abatement. It also provides a review of best practices and examples of the energy and cost savings that can be achieved through Green IT.
Here are the notes from the slides:
Good morning!
I am here representing the Office of Technology and Electronic Commerce in the U.S. Department of Commerce’s International Trade Administration.

The mission of our office and other ITA industry offices is to advocate for a domestic and international trade environment that supports U.S. competitiveness and innovation.

As the first speaker in this session, I am going to present an overview of what our office found in researching Green IT last year and why we believe greening IT infrastructure is worthwhile for U.S. manufacturers. This effort is part of a larger Sustainable Manufacturing Initiative (SMI) under way in our agency, which I will briefly discuss at the end of this presentation.

Our objectives in undertaking a Green IT Initiative are:
  • To help U.S. companies, particularly smaller enterprises, to assess, manage, conserve and reduce the energy consumption of their IT infrastructure and thus become more cost competitive with foreign firms and
  • To contribute to U.S. efforts to deal with global warming and to reduce dependency on fossil fuels, particularly those from foreign sources
I would like to start out by answering the question “What is Green IT?” and defining this broad concept for you.

The goal of Green IT is to make the entire IT lifecycle greener by addressing environmental sustainability along the following four complementary paths:
  1. Green use — reducing the energy consumption of computers and other information systems as well as using them in an environmentally sound manner
  2. Green disposal — refurbishing and reusing old computers and properly recycling unwanted computers and other electronic equipment
  3. Green design — designing energy-efficient and environmentally sound components, computers, servers, cooling equipment, and data centers
  4. Green manufacturing — manufacturing electronic components, computers, and other associated subsystems with minimal impact on the environment
Our office’s current initiative is focusing on green use, although we do deal with the issue of green disposal in our trade policy work.

Why has Green IT become a significant concern for U.S. industry?

A May 2009 survey of North American companies conducted by Symantec, a leading IT security software supplier, revealed that 97 percent of respondents had discussed a Green IT strategy that included increasing their Green IT budgets and reducing energy consumption, cooling costs, and carbon emissions.

Their Green IT projects were primarily targeted at the data center, but were growing in other areas such as corporate desktop environments.

International Data Corporation (IDC) surveyed 1,653 firms around the world later in the year to evaluate what were the most pressing issues motivating them to adopt a Green IT strategy.

Not surprisingly, the top reason was the cost of energy, followed by the growth in corporate IT infrastructure.

Indeed, IDC has estimated that the annual cost of IT energy will surpass that of IT equipment within the next 5 years.

I think that it is important to understand ICT’s role as an energy consumer and an energy efficiency enabler, the phenomenon that some have called “the ICT Energy Paradox.”

According to the American Council for an Energy-Efficient Economy (ACEEE), the ICT Energy Paradox is one in which more attention tends to be paid to the energy-consuming characteristics of ICT rather than to the broader, economy-wide, energy-saving capacity that emerges through their widespread and systematic application.

ICT has played and will continue to play a critical role in reducing energy waste and increasing energy efficiency throughout the economy. From sensors and microprocessors to smart grid and virtualization technologies, there is a strong correlation between efficiency, productivity, and energy savings.

And while discrete technologies have successfully enabled significant energy savings, system-wide energy savings have also emerged from the growing ubiquity of ICT systems and technologies.

In terms of its environmental impact, some studies indicate that the manufacture and use of ICT currently produces 2-3% (approximately 0.86 metric gigatons) of the world’s CO2 emissions, equivalent to the carbon output of the entire aviation industry.

The ICT sector’s global carbon footprint is set to nearly double to 1.43 gigatons (Gt) by 2020, based on business as usual (BAU) projections made by the Climate Group.

PCs and their associated peripherals and printers will account for 57% of this ICT footprint followed by telecom infrastructure and devices at 25% and data centers at 18%.

According to a study of the energy consumption of the Internet conducted back in 2007, ICT equipment makes up about 5.3% of global electricity use and 9.4% of total U.S. electricity demand.

As was the case with ICT’s carbon footprint, PCs and monitors consume far more electricity than data centers and communications equipment.

The International Energy Agency (IEA) predicts that the energy consumed by ICT worldwide will double by 2022 and increase three fold by 2030 to 1,700 terawatt hours (tWh). This will equal the current combined residential electricity use of the United States and Japan.

This consumption will require the addition of nearly 280 Gigawatts (GW) of new generating capacity between now and 2030, presenting a great challenge to electric utilities throughout the world.

In 2006, servers and data centers used 61 billion kilowatt hours (kWh), or more than 1.5 percent of all the electricity generated in the United States, at a cost of nearly $4.5 billion, according to the Environmental Protection Agency (EPA). kWh consumption was twice the 2000 level.

EPA projections show that U.S. data center energy use alone could almost double to more than 100 billion kWh by 2011 for a cost of $7.4 billion and will account for 2.9 percent of U.S. electricity production. This share is projected to rise to 12% by 2020.

Compared to current efficiency trends, a combination of improved operations, best practices and state-of-the-art technologies in servers and data centers in the United States could have resulted in annual savings of approximately 23 to 74 billion kWh, $1.6 billion to $5.6 billion in electricity costs, and 15 to 47 million metric tons of CO2 emissions by 2011.

In the typical PC environment, the average desktop/monitor combination uses up to 2,000 kWh of electricity annually, of which 500-1,000 kWh can be reduced through simple power management. Laptops are more efficient, consuming far less energy each year.

A Harris Interactive study finds that half of U.S. workers fail to shut down their PCs at night. EPA estimates also show that 90 percent of enterprise desktop PCs do not use power management capabilities.

The EPA estimates that Americans would save $200 million in annual energy costs if they purchased Energy Star-qualified home office products, such as computers, printers, monitor/displays, copiers, and faxes.

Here is a list of some best practices that government and private sector energy-efficiency experts recommend for data centers.

At the top is the suggestion that companies considering greening their IT infrastructure should begin this effort by developing a strategic energy plan with realistic reduction goals.

A number of the energy-efficiency measures listed involve costs such as upgrading to more energy-efficient computers and peripherals and implementing virtualization technologies.

Others have to do with less costly practices such as powering down and retiring underutilized servers.

As I will show in the next slide, adopting these best practices will very often bring savings that can offset the costs of implementing them and significantly reduce electricity consumption.

Server replacement and consolidation is one of the most effective energy-efficiency best practices.

In this example, 184 servers that were installed in 2005 have been replaced by and consolidated into 21 new more energy efficient systems, with greater computing capability, resulting in a 92% reduction in annual electricity consumption.

The costs of new hardware and the operating systems licenses in the first year total $165,900. However, the consolidation cuts the costs of operating systems licenses by $146,700 and brings electricity cost savings of more than $41,000. The payback on employing this best practice is around ten months.
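
The stated payback can be checked with simple arithmetic. This is just a sketch using the figures quoted above (all dollar amounts come from the talk, not from independent data):

```python
# Rough payback check for the server-consolidation example above.
# All figures are the ones quoted in the presentation.
hardware_and_licences = 165_900   # first-year cost of new hardware + OS licences ($)
licence_savings = 146_700         # annual operating-system licence savings ($)
electricity_savings = 41_000      # annual electricity cost savings ($)

annual_savings = licence_savings + electricity_savings
payback_months = hardware_and_licences / annual_savings * 12

print(f"Annual savings: ${annual_savings:,}")   # Annual savings: $187,700
print(f"Payback: {payback_months:.1f} months")  # Payback: 10.6 months
```

The result, a little over ten months, matches the "around ten months" claimed in the presentation.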

These practices deal with energy-efficiency in the PC environment.

As in the case of the data center, there are practices that firms can adopt without spending much money. They include turning off PCs at night and using free or inexpensive power management software.

Upgrading to more energy-efficient PCs, peripherals, and power supplies may be costly.

However, most companies have to refresh their PC installed base every 3-5 years anyway as their capacity and data processing needs expand with their business operations.

The next slide presents PC upgrade/replacement and power management scenarios and the electricity and cost savings they could achieve.

This slide shows exactly what the annual energy costs and savings would be if an installed base of 1,000 old desktop PCs with CRTs were replaced with an equal number of LCD monitors, new more energy-efficient desktop and laptop PCs, and power management technology.

Again, the electricity cost savings are significant when new managed desktop and laptop systems are introduced.

Use of managed desktops with LCDs drops the annual electricity cost down to $22,900 and brings an annual savings of $78,600 over the cost of the energy used by the older, unmanaged systems with CRTs.

The annual electricity cost of the new laptop PCs is even lower, at $3,800, and the annual savings are nearly $98,000.
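
As a quick consistency check (a sketch using only the figures quoted above; the baseline cost of the old fleet is implied rather than stated), the annual electricity cost of the old unmanaged desktop/CRT fleet can be reconstructed from either scenario:

```python
# Back-of-envelope reconstruction of the 1,000-PC scenario above.
# All dollar figures are the ones quoted in the presentation.
fleet_size = 1_000

# New managed desktops with LCD monitors
desktop_cost = 22_900      # annual electricity cost ($)
desktop_savings = 78_600   # annual savings vs the old unmanaged CRT systems ($)

# New managed laptops
laptop_cost = 3_800
laptop_savings = 98_000    # "nearly $98,000"

# Implied annual cost of the old fleet, computed two ways
baseline_via_desktops = desktop_cost + desktop_savings  # 101,500
baseline_via_laptops = laptop_cost + laptop_savings     # 101,800

print(baseline_via_desktops, baseline_via_laptops)
print(f"per old PC: about ${baseline_via_desktops / fleet_size:.0f}/year")
```

Both routes imply the old fleet cost about $101,000 to $102,000 a year to power, around $100 per PC, so the quoted savings are internally consistent.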

I know that many companies may not be able to afford a major investment right now in greening IT installations, so I am providing you with a list of 10 easy, low- or no-cost ways to save IT energy and to cut down on CO2 emissions. Here are some of the results a firm can expect:

  • The use of power management will bring at least a 20% reduction in electricity consumption and could result in average savings of $50 per year for each PC, according to the Department of Energy (DOE).
  • That means that simple power management of the 108 million desktop PCs in U.S. organizations could net around $5.4 billion. It would also eliminate nearly 20 million tons of CO2 each year, roughly equivalent to the impact of 4 million cars.
  • Finally, turning off desktop PCs at the end of the business day provides additional benefits since it slashes energy use 30-50%.
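
These figures hang together arithmetically, as a quick check shows (the per-PC saving, fleet size, CO2 tonnage and car equivalence are all the DOE/talk figures quoted above):

```python
# Sanity check on the power-management figures quoted above.
pcs = 108_000_000    # desktop PCs in U.S. organizations (figure from the talk)
saving_per_pc = 50   # average $ saved per PC per year (DOE figure)

total_savings = pcs * saving_per_pc
print(f"${total_savings / 1e9:.1f} billion per year")  # $5.4 billion per year

# 20 million tons of CO2 ~ 4 million cars implies about 5 tons of CO2
# per car per year, a commonly used per-vehicle estimate.
co2_tons = 20_000_000
cars = 4_000_000
print(co2_tons / cars)  # 5.0
```
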
In wrapping up this talk and as a lead in to the next presentation, I want to provide you with a brief overview of green IT efforts within the U.S. Government.

President Obama signed an Executive Order late last year that requires Federal Government agencies to set an example for the nation by significantly reducing their greenhouse gas emissions and energy use by 2020.

The order mandates they ensure that 95% of new IT equipment purchases are Energy Star or Federal Energy Management Program compliant and are certified by the Electronic Product Environmental Assessment Tool (EPEAT).

It also requires them to implement best practices for energy-efficient servers and data center management including power management policies.

What has been the progress of Federal Green IT efforts thus far?

A CDW survey of 150 Federal IT managers in mid-2009 found that nearly half of them had reduced their energy costs for powering PCs and other IT equipment by 1% or more.

They noted that each of their agencies spends on average nearly $51 million annually on electricity for its IT infrastructure, which represents about 13 percent of its total IT budget.

These managers believe that their agencies could actually cut their power costs by 18%, saving $9.1 million a year, if they implemented all the available best practices such as purchasing Energy Star IT equipment, using power management, and virtualizing servers, PCs and storage.

Those of you who are interested in receiving more information and assistance on greening an IT infrastructure should contact the following non-profit industry groups and government agencies. (Mention some of these organizations)

Many of these organizations provide metrics and software tools for conducting an IT asset inventory, measuring both total IT and individual device energy use, and monitoring and managing IT power consumption.

Here are more specific examples of public and private sector offerings so that you have a better idea of what is available to help you in your Green IT efforts. (Review examples)

Those products from U.S. Government agencies are free of charge and available on their websites.

This slide provides the web addresses of the organizations I have recommended as good sources of information and assistance for you on IT energy use and efficiency.

Finally, I also suggest that you check out my agency’s Sustainable Manufacturing Initiative for information that addresses the broader greening of operations that a company should consider, including Green IT.

As noted on this slide, the SMI has established a Sustainable Business Clearinghouse that is a free, online database of nearly 800 federal and state level programs and resources that enhance sustainability and competitiveness.

The database has the DOE and EPA websites I mentioned previously.

I hope that you found my presentation useful.

I will do my best to answer any questions you may now have.

Please feel free to contact me if you want a copy of my presentation or would like to discuss our office’s Green IT initiative further.

Thank you!

Wednesday, October 20, 2010

Walter Burley Griffin and Marion Mahony Talk in Sydney

The Walter Burley Griffin Society will have a talk by Professor James Weirick on ‘The Solid Rock House: Walter Burley Griffin and Marion Mahony in 1910 – innovation and inspiration’, 31 October 2010 at 3.30pm at Castle Cove library hall, Castle Cove, Sydney. This follows the 2pm AGM.
The Walter Burley Griffin Society Inc. cordially invites you to its twenty second Annual General Meeting Sunday 31 October 2010 at 2.00pm at Castle Cove library hall, 8b Deepwater Road, Castle Cove followed at 3.30pm by guest speaker Professor James Weirick ‘The Solid Rock House: Walter Burley Griffin and Marion Mahony in 1910 – innovation and inspiration’

The 2010 AGM address will explore the ideas behind the Griffins’ revolutionary flat-roofed, concrete house of 1910, innovative in itself and as precursor to the year of inspiration that culminated in the Griffin entry in the Australian Federal Capital Competition, 1911.

Light refreshments will be served.

The Castle Cove library is a five minute walk up the hill from the 207 bus stop in Eastern Valley Way. Alight near corner of Castle Cove Drive where there is a pedestrian crossing at traffic lights.

Check Sydney Buses website for bus times. Parking available in Deepwater Road. For further information contact Kerry McKillop on 02 9958 4516 or Adrienne Kabos on 02 9958 2060 (ah).

Open Model for Australian Universities

Universities Australia is holding a policy forum on "The Open Model - Innovation in Government, Science and Research", featuring Professor Beth Noveck, White House Deputy Chief Technology Officer for Open Government, at Parliament House, Canberra, 28 October 2010.

Unfortunately the program makes no mention of the most important area in which the open model can directly impact the Australian universities, the community and economy: open access applied to education.

After tourism, education is Australia's largest services export, worth $18.6B in 2009. E-learning, using Australian-developed open source software and open access content, is transforming the way university education is provided. An example is the UK Open University.

This is an opportunity for Australia's education export industry. Australian universities can incorporate their open access research and data in online courses to remain competitive on the world market (a model I call "e-oxbridge"). If Australian universities fail to adopt e-learning, they will cease to be competitive. Billions of dollars in export income will be lost if international students choose not to study at Australian universities. Billions more dollars will be lost if Australian students choose to enrol in cheaper, higher quality, overseas online university courses and abandon Australian campuses.

Universities Australia Policy Forum 2

8.30am-1.00pm, Thursday, 28 October 2010
Mural Hall, Australian Parliament House, Canberra

The Open Model - Innovation in Government, Science and Research

Intended Purpose

It is often remarked that the transformation that society is undergoing at present is at least as great as that of the Industrial Revolution. As well as the broader social and business effects of the revolutionary changes wrought by and through the internet, there are profound actual and potential effects on the way government and related services, as well as research and science are conducted.

We are challenged to harness and strategically leverage the “new connectedness” to seek to exploit new domains of interaction. How can we assess and respond to this challenge? In particular what principles should we develop to understand the gains to be made through more open access to services and content?

Intended Audience

Parliamentarians, invited government and university representatives and other interested stakeholders.

Brief program

Thematically the half day will move through two phases, opening with a discussion about openness in government and access to government services, and moving on to related questions for open access to research, science, and research data in our universities.

These phases will be brought together through a final session in which access to innovation nationally and internationally will become the theme. The government theme will be explored through a presentation by Nicholas Gruen, and extended through international examples from key presenters.

The research questions will be opened up by John Wilbanks and others, and Richard Jefferson will make the linkages involving innovation and access to innovation as the end piece.

8.30 APH security clearance and escort to Mural Hall

9.00-9.10 Welcome and Scene Setting: Professor Peter Coaldrake, Chair, Universities Australia, Vice-Chancellor, QUT
9.10-10.30 Open Government
Chair and Commentator: Professor Brian Fitzgerald, Professor of Intellectual Property and Innovation, QUT
10.30-11.00 Morning Tea
11.00-1.00 Open Access to Research

Chair and Commentator: Professor Tom Cochrane, Chair, Australian eResearch Infrastructure Council, Deputy Vice-Chancellor, QUT

  • Mr John Wilbanks, Vice-President, Science, Creative Commons: Future trends in science
  • Professor Steven Schwartz, Vice-Chancellor, Macquarie University: Open Access: An institutional perspective
  • Dr Warwick Anderson AM, Chief Executive, National Health and Medical Research Council: Open Access: A funder’s perspective
  • Dr Michael Spence, Vice-Chancellor, The University of Sydney: Obstacles to collaboration and access
  • Dr Terry Cutler, Principal, Cutler & Company: Open Access and Innovation
  • Professor Richard Jefferson, Professor of Science, Technology & Law, QUT: Access to open innovation
12.30-1.00 Close

Murray-Darling Water Cuts in Google Ads

On my web site today I noticed a Google AdWords advertisement from the Australian Government "Water for Our Future". Curiously the ad was on a posting about government web standards, unrelated to water use.

This links to a Department of Sustainability, Environment, Water, Population and Communities web page about the controversial Murray-Darling Basin Authority report "Guide to the Basin Plan", which recommends cuts to irrigation water allocations. There is an online form for providing feedback and an Australian Water Education Toolkit intended for schools. Although presenting only one side of the argument, the material is well presented; about the only thing lacking is some form of online forum where the issues could be discussed.

ps: Two of the towns which will be affected by the water plan are Leeton and Griffith, which were designed by Walter Burley Griffin and Marion Mahony Griffin.

Tuesday, October 19, 2010

Web Standards for the Australian Government

Greetings from the Australian Defence Force Academy (ADFA) in Canberra, where the October Web Standards Group meeting was held.

This meeting is sponsored by the Defence Department and so the location is appropriate. However, the ADFA Adams Hall is not the best location for a meeting. This is a multi-purpose hall with a polished wooden floor and steeply raked seating at the back. It looks like a US college gym and at any moment I expected a cheerleading squad to march in.

Today's speakers were:

1: Neil Philips - FOI Legislative Requirements to Publish Information On the Web
2: Raven Calais - WCAG 2.0 and the Website Accessibility National Transition Strategy
3: Gordon Grace - Making Better Use of On-line Data
4: Eileen Tannachion - Changes to the Australian Government Locator Service Metadata Standard

These are all interesting topics, but this seems like twice as many speakers as can reasonably fit into such a format.

The MC for the day was Tony Corcoran, Assistant Secretary, Freedom of Information and Information Management Branch, Department of Defence. He admitted that the Defence Department was not well prepared for new Government FOI legislative requirements (but it sounded like they were making rapid progress to be ready in time). Defence is doubling the capacity of their records management system from 30,000 to 60,000 users. An imaging system is being installed. Also many routine manuals which are currently on the internal intranet will be moved to the Internet to avoid the need for FOI requests.

Tony modestly said that his area of Defence was not "technical" being staffed by policy people. I have no doubt that there are suitably qualified people from information disciplines in Defence to write policy (I used to be one of them).

1: Neil Philips "FOI Legislative Requirements to Publish Information On the Web".

The new FOI procedures from 1 May 2011 are for more proactive disclosure. Public servants have protection under the legislation if they accidentally release something they should not. The legislation provides for agencies to sell documents, but hopefully most material will be provided freely online.

The main requirement is that the agency publish on its web site a list of information about the information it holds (metadata), an organisation chart and other details about what the agency does. What was not clear to me was whether there are standards for the format used for these lists. It would make sense if all agencies used the same format and a government-wide search facility was provided. The Information Commissioner is expected to produce guidelines in November.

2: Raven Calais - WCAG 2.0 and the Website Accessibility National Transition Strategy

Raven started by saying the presentation would be provided afterwards, so I shouted out "in an accessible format?". She replied that there were some accessibility problems with Powerpoint and so the presentations would be offered in other formats.

Raven pointed out that accessibility is not just for a small group identified as "disabled", such as just the blind. Accessibility is an issue for much of the community at different times in their lives. Currently 20% of the Australian population identify themselves as having a disability. People with a disability have a higher unemployment rate. Accessibility is about social inclusiveness.

The Online and Communications Council of Australia agreed that all federal, state and local government web sites would be WCAG 2 compliant by November 2012.

Federal FMA agencies are required to be Level A compliant by 31 December 2012 and Level Double-A by 31 December 2014.

AGIMO has a National Transition Strategy. This involves agencies first taking stock of what web sites they have, then checking current compliance, then assessing the publishing process and what barriers there might be. A barrier might be a lack of accessibility training for web staff in a decentralised web publishing process.

AGIMO is developing a Community of Expertise (COE) under the AGIMO blog. This will then be updated to be a forum and have a document repository.

Raven stated that PDF, RTF and MS Word files are not accessible and so accessible HTML alternatives are required for these. I found this a refreshingly frank and practical approach. In theory it is possible to create accessible PDF, RTF and MS Word files, but as Raven says, there are not sufficient tools and techniques to support these in practice. In my view a sensible approach for agencies would be to implement accessible HTML which also displays and prints well. Agencies could then dispense with the complexity and expense of creating versions of documents in PDF, RTF and MS Word. The one HTML version would be suitable for all uses.

Raven nominated Canada as leading the way with accessibility.

Expected skill sets for roles are to be released soon. This would help me in teaching web accessibility to public servants at ANU.

I was very impressed with Raven's practical approach to accessibility.

Funnelback then provided afternoon tea.

3: Gordon Grace then talked on "Making Better Use of On-line Data".

Gordon pointed out that data web sites are not that exciting and (which is not online yet) will be no exception. The idea is to provide access to "raw data" which can then be combined and processed. To me this is conceptually very similar to data repositories for research (and uses many of the same technical standards). The USA's was launched May 2009, Australia's beta in October 2009, the UK's relaunched May 2009. AGLS, AGIFT, DCAT, Dublin Core, hCard, vCard and X500 were some of the standards used.

Gordon pointed out that budget papers are now being published under a creative commons licence, but the data tables are currently GIF images and so not usable as data.

Gordon mentioned that he would not be able to join in GOVDEX online discussions of data in the future, as he had left the public service and so would not have an email address. I found this curious as I have previously taken part in GOVDEX discussions as an industry expert.

Gordon showed an interesting example of XHTML with RDFa embedded. This allows data fields (such as phone numbers) to be identified in web pages. This allows both human and machine readable web pages. He pointed out these could be particularly useful for smart phones, if standards were supported. One difficulty this will cause is that "hits" on web sites may be reduced as data can be extracted from the web site once and provided via an intermediary. Gordon suggested that academics working on citation statistics might have a solution for this, but judging by the session I attended at ANU yesterday on assessing research.

Gordon suggested that providing data would be one way for agencies to easily able to meet FOI requests.

4: Eileen Tannachion - Changes to the Australian Government Locator Service Metadata Standard

Eileen started by pointing out that AGLS is now an Australian Standard, AS 5044-2010, and so is not just for government use. The updated standard supports new Dublin Core features. The thesauri are AGIFT and TAGS.

AGRkMS is a subset of SPIRIT. AGLS has considerable compatibility with AGRkMS but is not a strict subset of it.

Recent changes to AGLS include alignment with linked data, the semantic web and the 2008 DC abstract model. AGLS qualifiers are now in the "aglsterms" namespace. Some XML examples and encodings were removed. Five additional DC terms were added and included in AGLS. Two new AGLS terms were added: dateLicenced and protectiveMarking. The document type vocabulary is now common with AGRkMS. AGLS.audience has been replaced with DC.audience. DC.Coverage.postcode will be replaced with a geo-spatial term still being worked on. Some other vocabularies have been updated, and some properties are now "recommended".

Challenges with AGLS include implementing it in HTML 4.01 (I wonder about HTML 5?). There is also the issue that Google is starting to collect some of the metadata (will Funnelback?).
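As a rough sketch of how AGLS metadata appears in practice (the element naming follows the DCTERMS/AGLSTERMS meta-tag convention as I understand it; the page titles and values below are invented), the terms are carried in ordinary HTML meta elements which a program, or a search engine such as Google or Funnelback, can harvest:

```python
from html.parser import HTMLParser

# Hypothetical page head carrying AGLS metadata as <meta> elements.
# Term names follow the DCTERMS/AGLSTERMS convention; values are invented.
page = """<html><head>
<link rel="schema.DCTERMS" href="http://purl.org/dc/terms/">
<meta name="DCTERMS.title" content="Example Agency Annual Report">
<meta name="DCTERMS.creator" content="Example Agency">
<meta name="AGLSTERMS.function" content="example function term">
</head><body></body></html>"""

class MetaCollector(HTMLParser):
    """Collect name/content pairs from <meta> elements."""
    def __init__(self):
        super().__init__()
        self.meta = {}
    def handle_starttag(self, tag, attrs):
        if tag == "meta":
            a = dict(attrs)
            if "name" in a:
                self.meta[a["name"]] = a.get("content")

collector = MetaCollector()
collector.feed(page)
print(collector.meta["DCTERMS.title"])  # Example Agency Annual Report
```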

Social media advice was issued in September 2010. The Web Archiving Policy will be issued at the end of October 2010. The new AGLS manual will be issued as an exposure draft in November 2010. This will be useful for my new "Electronic Document Management" course at the ANU.

Ruth Ellison ended by asking for help with future WSG events: organising, speakers, venues and sponsors.

Military Data Network Dependability

"Navy Network Dependability: Models, Metrics, and Tools" is another in the excellent series of technical reports from the RAND Corporation on aspects of military operations. In this report the reliability of the telecommunications networks used by the US Navy is investigated.

The emphasis here is on how reliable the network shared between Navy ships and aircraft in a strike group looks to the military user. A methodology for assessing this is outlined and then applied. As with previous RAND reports the report provides a clearly written but technically rigorous analysis. However, it is limited by the brief given.

In this case the limitation of the analysis is shown by three examples of networking in a carrier battle group: the IP Network, the Secure Voice Equipment String and the Tactical Data Link Equipment String. These are three different networks, all of which have to be supported on ships and aircraft. The logical way to make this more reliable would be to use one network which could prioritise communications and route them over all available links. Having separate networks devoted to voice and different sorts of data makes little sense and is a legacy of how such systems were developed.
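The availability arithmetic underlying such models can be sketched in a few lines (the figures below are invented for illustration, not taken from the report): components in series must all be up, while redundant components in parallel fail only if all of them fail, which is why routing over all available links pays off:

```python
# Back-of-the-envelope availability arithmetic. Figures are illustrative.
def series(*avail):
    """All components in series must work: availabilities multiply."""
    a = 1.0
    for x in avail:
        a *= x
    return a

def parallel(*avail):
    """Redundant components: the service fails only if all of them fail."""
    u = 1.0
    for x in avail:
        u *= (1.0 - x)
    return 1.0 - u

# A user-visible service depending on a radio link, a router and a server:
single_path = series(0.95, 0.99, 0.999)
# The same service with two independent radio links:
dual_path = series(parallel(0.95, 0.95), 0.99, 0.999)
print(round(single_path, 4), round(dual_path, 4))  # 0.9396 0.9865
```

Even this crude model shows why one network using every available link beats several single-purpose networks.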

Navy Network Dependability

Models, Metrics, and Tools

By: Isaac R. Porche, III, Katherine Comanor, Bradley Wilson, Matthew J. Schneider, Juan Montelibano, Jeff Rothenberg

The Navy is increasingly dependent on networks and associated net-centric operations to conduct military missions, so a vital goal is to establish and maintain dependable networks for ship and multiship (e.g., strike group) networks. In this volume, the authors develop a framework for measuring network dependability that is focused on users' perceptions of whether individual network services are available, as opposed to hardware-focused measurements of whether individual pieces of equipment are functioning. The authors used this framework to modify a tool for modeling network availability that was originally developed by Space and Naval Warfare Systems Command; the modified tool allows the user to perform sensitivity analysis that captures the degree to which individual network components affect overall mission operational availability. The authors walk the reader through some exemplar analyses, then conclude with recommendations on how the Navy might facilitate future network dependability assessments, provide more meaningful results to network engineers, and, ultimately, enhance the dependability of networks across the fleet.

Pages: 126

ISBN/EAN: 9780833049940

Monday, October 18, 2010

The ERA era of measuring research output in Australia

Greetings from "RESEARCH ASSESSMENT AND PUBLICATION METRICS - THE BEGINNING OR END OF AN ERA?" at the Australian National University in Canberra. The question three speakers will answer is: "How is research excellence measured and evaluated? What are its key signs and indicators?" This is in the light of the Australian Research Council Excellence in Research for Australia scheme (ERA).

Colin Steele amused the audience with the story that the UK medical council showed that having a colon ":" in the title of a paper increased its ranking and that as a result more authors put colons in their titles (and Colin made the obvious pun about colons and medicine).

Andrew Calder, Director, Research Performance and Analysis, ARC, then spoke. He pointed out that not just one measure would be used for rating research, but that several measures may be combined. This suggests a logical flaw in the argument: if each of the measures used is not a good one, then adding them together does not give a better result. He explained that hundreds of experts will be carrying out reviews, rather than some automated metric. As discussed later, I suggest an automated metric is likely to be more accurate and more accountable. Also the cost of carrying out the manual evaluations must be high. Ranked conferences and journals are used, with the proportions varying by discipline (conferences are more important in IT, for example).

John Wellard, Director, ANU Research Office, pointed out that the ERA is likely to be used by the federal government for measuring the performance of universities and deciding their funding. One flaw with the process John described was the use of the term "employed". As an Adjunct Lecturer I am not "employed" by the ANU, although I am regarded as a staff member. Clearly if I write a paper it should count as an ANU paper (the ANU staff keep asking me for papers to add to their census list).

Professor Andrew Cockburn, Director, College of Medicine, Biology and Environment, ANU, pointed out the many flaws in research rankings, taking the "H-index" (aka "H-score") as an example. Such scores give distinguished pioneers in a field a much lower ranking than later prolific but less distinguished authors. He pointed out that those who benefit from the rankings have a strong incentive to publish in a way that maximises the measures. As a result the ranking systems will need to be continually changed, removing the possibility of long term measures. Also, papers are being written without a "methods" section, with a resulting loss of knowledge.
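The H-index he criticised is simple to compute: it is the largest h such that an author has h papers each cited at least h times. A quick sketch (with invented citation counts) shows the effect he described:

```python
def h_index(citations):
    """Largest h such that the author has h papers with >= h citations each."""
    h = 0
    for i, c in enumerate(sorted(citations, reverse=True), start=1):
        if c >= i:
            h = i
        else:
            break
    return h

# A pioneer with a few landmark papers scores far lower than a
# prolific author with many moderately cited ones:
print(h_index([900, 450, 120]))  # 3
print(h_index([30] * 20))        # 20
```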

Professor John Houghton, Victoria University, talked about his extensive work on the economics of scholarly publishing, including the report for the UK's JISC, "Economic Implications of Alternative Scholarly Publishing Models", and the more recent "Costs and Benefits of Research Communication: The Dutch Situation" and "Costs and Benefits of Alternative Publishing Models: Denmark". He also mentioned related work in Germany and by Alma Swan in the UK. John went on to discuss the positive role of open access and how journal reference measures may inhibit it.

Dr Danny Kingsley, Manager, Scholarly Communications, ANU, pointed out that the previous speakers had already covered many of the issues. One problem she mentioned was that citations would be measured over only two years, so papers of long term worth will not count. A second problem is that the measures are of the quality of journals, not of individual papers. A third problem Danny identified is that high rejection rates in "quality" journals result in rejected papers being recycled through lower ranking journals, with duplicated reviewing effort.

As a solution Danny suggested that the measures should support the processes each discipline uses, rather than forcing a rigid "publication" measure on them. She suggested "Web 2" measures. Danny also suggested that social impact should be measured, pointing out this might come from practitioners using research results, rather than from other researchers citing them. This all makes sense to me, as I don't publish research papers myself, but do use them in industry publications. Therefore my work does not count towards the ANU's publication output, nor do my citations of that work. However, in terms of impact on the community in the IT discipline, my publications and the papers I cite probably have more effect than all the other IT papers published by ANU researchers. This is because my work is read by practitioners and government policy makers and implemented by them: IT professionals write computer programs and build hardware using the techniques I suggest, companies and government agencies adopt them as policies, and they are taken up in standards and laws.

Dr Claire Donovan, Lecturer in Sociology, ANU, talked about Research Impact – the Wider Dimension.

At that point there was supposed to be 40 minutes of questions and discussion. However, as the speakers ran over time, there were only 4 minutes left for questions. This perhaps illustrates a problem with traditional university publishing: apart from the invitation to the event, with the names and topics of the speakers, the ANU published no materials as part of this process. Apart from my blog posting, there are no details available as a result of this event. In terms of impact, this event will therefore have little effect, as few people will be able to find details.

Some thoughts on the issue

The ARC has produced Ranked Journal and Conference Lists. Ranked journals require an ISSN (this is a problem for conferences). Andrew Calder pointed out that a publication in a "B" ranked journal may be better for the author, than an "A" ranked one, if it results in more local citations.

I did a quick search of the list to find publications I am familiar with. Curiously I could only find one of the hundred or so volumes of the "Conferences in Research and Practice in Information Technology":

ID | Rank | Title | Field(s) of Research
17766 | B | Journal of Research and Practice in Information Technology | 08 Information and Computing Sciences
19280 | B | Australasian Journal of Information Systems | 0806 Information Systems; 1702 Cognitive Science
42358 | C | ACS/IEEE International Conference on Computer Systems and Applications (AICCSA) | 08 Information and Computing Sciences; 12 Built Environment and Design

In my view the answer to ranking papers is reasonably obvious and along the lines Danny suggested. As research publishing moves online it will evolve to include social networking techniques, which can measure the ranking of people based on peer assessment. Essentially the current publication metrics are a crude form of such rankings, but these can be improved, refined and made much cheaper and more auditable. There is now a credible body of research literature on this topic, but it is unknown to all but a few IT researchers. An example of how to do this is Soo Ling Lim's work, reported at the ANU on Thursday: "Using Social Networks to Identify and Prioritise Software Project Stakeholders".
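To make the idea concrete, here is a minimal sketch of ranking researchers by peer endorsement over a hypothetical "who endorses whom" graph, using a PageRank-style iteration. This is my own illustration of the general technique, not Soo Ling Lim's specific method, and the names are invented:

```python
def peer_rank(endorsements, damping=0.85, iterations=50):
    """PageRank-style peer-assessment ranking over an endorsement graph.
    endorsements maps each person to the list of people they endorse."""
    people = set(endorsements) | {p for ts in endorsements.values() for p in ts}
    rank = {p: 1.0 / len(people) for p in people}
    for _ in range(iterations):
        new = {p: (1 - damping) / len(people) for p in people}
        for src, targets in endorsements.items():
            if targets:
                share = damping * rank[src] / len(targets)
                for t in targets:
                    new[t] += share
        # People who endorse no one redistribute their weight uniformly.
        dangling = sum(rank[p] for p in people if not endorsements.get(p))
        for p in people:
            new[p] += damping * dangling / len(people)
        rank = new
    return rank

# Two peers endorse bob; bob endorses alice; carol gets no endorsements.
ranks = peer_rank({"alice": ["bob"], "carol": ["bob"], "bob": ["alice"]})
print(max(ranks, key=ranks.get))  # bob
```

Crude as it is, this kind of computation is cheap, repeatable and auditable, unlike a manual expert review.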