Showing posts with label NICTA. Show all posts

Monday, November 16, 2015

Data 61: Time for Ideas into Action

Greetings from Data 61 at the Australian Technology Park in Sydney, for "Australia 3.0: catalysing ideas into action". There is a panel with David Rohrsheim, GM Uber Australia and New Zealand, Jason Clare MP, Shadow Minister for Communications, and Nick Abrahams, author of "Digital Disruption in Australia". Jason Clare had the most perceptive comment of the day, when he said: "Australian researchers produce twice as many academic papers per head as their US counterparts, but half as many patents". This aspect got left out of the subsequent discussion: how to harness publicly funded research for community benefit.

Before the panel I had a quick tour of Data 61, which was formed from the IT research components of CSIRO and NICTA. What I hoped to hear was how the new organization was going to do things differently, to overcome the problems of CSIRO and NICTA, which were not able to effectively transition research into commercially successful innovations. Unfortunately what I instead saw were some old NICTA demonstrations. More than $1B has been invested by the Australian community in Data 61/NICTA, and the organization has about six months to come up with a credible strategy to show a return on that investment.

The NSW Government announced 12 November 2015 that Mirvac would purchase the Australian Technology Park (ATP), with the Commonwealth Bank as the major tenant. This provides a good opportunity for Data 61 to rethink what they do and how they do it. Data 61 needs to be able to answer the question I ask every research student in their final presentation: "How can we make money out of this?".

In the past NICTA and CSIRO have taken the approach of gently introducing their researchers to commercial considerations. Generally this approach has not worked, and there is little prospect of Data 61 succeeding if it continues with it. I suggest they need to formally train staff in innovation and entrepreneurship and have them take part in start-up competitions. Some staff may find this unpalatable, and they can be encouraged to find a job in academia. The remaining staff can then get on with producing results.

Sunday, April 26, 2015

Digital Services for a New Economy in Canberra

The free event "Open Data and Digital Services: Foundations for a New Information Economy" by Open Knowledge Australia is at NICTA Canberra, 2:30 to 5:30pm, 13 May 2015.
"Hear from leading speakers covering subjects related to open data and digital Government service delivery.

Learn about the foundational elements of a new information economy that is already connecting public and private sectors throughout Australia and the world."

With:

Nicholas Gruen: Chair of Open Knowledge Australia
Jed Sundwall: Global Open Data Technical BDM for Amazon Web Services
Pia Waugh:  Director of Analytics and Discovery Layer, Digital Transformation Office
Steve Bennett: Community Contributor, Open Knowledge Australia
Steven De Costa: Open Knowledge Australia, CKAN Association and Link Digital

Drinks at 4:30pm.

Book at Eventbrite

ps: I will be discussing "Innovations in Teaching Innovation", 27 April 2015, 4pm, CSIRO seminar room in the ANU CSIT Building in Canberra: http://es.csiro.au/ir-and-friends/

Friday, February 22, 2013

Australian Government Open Information Access Report

Greetings from Canberra, where Professor John McMillan, the Australian Information Commissioner, just launched the report "Open public sector information: from principles to practice". The commissioner pointed out that agencies have difficulty making their documents accessible to people with a disability, and do not have metadata for many documents. Another challenge is implementing the government's policy of open access being the default. Professor McMillan was speaking at an event organized by NICTA as part of "Open Data Day".

Open public sector information: from principles to practice

Report on agency implementation of the Principles on open public sector information

February 2013

Contents

  1. Foreword
  2. Background to the PSI survey
    1. Outline of this report
  3. 1. Summary of findings
    1. Overview of key challenges
    2. Priority areas for action
  4. 2. Results
    1. Open PSI Principle 1: Open access to information – a default position
    2. Open PSI Principle 2: Engaging the community
    3. Open PSI Principle 3: Effective information governance
    4. Open PSI Principle 4: Robust information asset management
    5. Open PSI Principle 5: Discoverable and useable information
    6. Open PSI Principle 6: Clear reuse rights
    7. Open PSI Principle 7: Appropriate charging for access
    8. Open PSI Principle 8: Transparent enquiry and complaint process
  5. 3. Analysis
    1. Strong agency leadership
    2. Strategic management of PSI assets
    3. Public engagement
    4. Information management
    5. Using Web 2.0 to support open PSI
    6. Open licensing
    7. Charging for access to PSI
    8. Complaints and enquiries processes
    9. PSI issues for galleries, libraries, archives and museums
    10. Keeping abreast of international developments
  6. Appendix A – PSI survey methodology
    1. Survey implementation
    2. Focus groups
    3. Interim results
  7. Appendix B – Summary of UTS intern research project: Access to and use of public sector information: The academic reuser perspective
    1. Background
    2. Process
    3. Findings and recommendations
  8. Appendix C – Definitions

Thursday, February 21, 2013

Opening Government Data in Australia

NICTA are hosting "Opening Government Data in Australia – Next Steps" with the Australian Information Commissioner, in Canberra at noon on Friday, 22 February 2013. Other Australian events for "Open Data Day" are Hack for Education and Environment at UTS in Sydney and "Maleny2013".

Opening Government Data in Australia 

Description:

To support and encourage the International Open Data Hackathon (http://opendataday.org/), the Australian Information Commissioner, the NICTA/eGov Cluster and the Office for Spatial Policy are holding a launch event about next steps for opening up government data in Australia.
Recording: This event will be recorded and published online after the event.
Date: Friday 22 February 2013
Time: 12:00 - 13:00
RSVP : Eventbrite

Schedule:

12:00 – The OAIC will launch their latest publication, "Open public sector information: from principles to practice", and Australian Information Commissioner Professor John McMillan will discuss:
  • agency successes in implementing the Principles on open public sector information
  • the challenges to be overcome
  • the opportunities for agencies to increase the value of public sector information
  • action the OAIC will take to assist agencies to further implement the Principles
  • international developments in the open government movement
12:20 – Helen Owens from the Office of Spatial Policy – will discuss progress on opening government geospatial data and specifically the Foundation Spatial Data Framework.
12:35 – The eGov Cluster will present some recent projects using government data for public benefits and innovation, and will discuss how governments can collaborate with the private sector.
12:50 – A short Q&A panel session with the speakers about opening government data in Australia.

Monday, February 18, 2013

Australian Industry Innovation Precincts Proposed

The Australian government has proposed up to ten Industry Innovation Precincts at a cost of $500M to "drive productivity, improve connections between business and the research sector and mobilise Australian industry to compete more successfully in global markets." Each precinct will have a research organization (university or CSIRO) as well as business, to foster mobility between academic institutions and businesses.

Available are:
  1. Executive Summary
  2. The full statement: "A Plan for Australian Jobs: The Australian Government's Industry and Innovation Statement"
  3. Media release: "Industry Innovation Precincts to create jobs of the future", Media Release, Minister for Industry and Innovation, the Hon Greg Combet AM MP, and the Minister for Agriculture, Fisheries and Forestry, the Hon Joe Ludwig, 17 Feb 2013
Many local, regional and national governments have tried to reproduce "Silicon Valley", with limited success. In 1996 I visited Cambridge (England) to see how the technology companies around the university developed. The area became known as "Silicon Fen" through a process known as the "Cambridge Phenomenon". In "Building Arcadia" I suggested how this could be emulated in Australia, making use of locations such as the Australian Technology Park (ATP) in Sydney. Later NICTA was set up at the ATP to foster innovation in the ICT industry. Australian governments have so far invested $1B in NICTA, at several sites across Australia.

In Canberra the "Innovation ANU" program was set up to teach university students how to turn a scientific discovery into a business. This was later broadened to "Innovation ACT", for students at all of Canberra's universities. I suggest the program could be broadened again and delivered on-line to students at all ten new Innovation Precincts, and elsewhere across Australia. Such a program could combine nationally delivered on-line materials with local "un-conference" events, which bring people from different fields together. A good example of an unconference is BarCamp Canberra, this year on 16 March at the Inspire Centre, University of Canberra (purpose built for this type of learning event). A national innovation program could offer participants a formal university qualification, counting towards a degree.


ANU Exchange at City West, Canberra

The new policy mentions CSIRO, but curiously does not mention NICTA. The investment of $50M per precinct proposed in the new government policy is small compared to the cost of initiatives such as NICTA and CSIRO. However, it would be useful in making linkages between research and industry, if used to accelerate already emerging precincts. An example is City West, with the ANU Exchange development to the west of the Canberra CBD, where the ANU campus is blending with government and private enterprises related to education and research.

Wednesday, February 13, 2013

ICT Education Future Here Now

Greetings from the Great Hall of Parliament House in Canberra, where Prime Minister Julia Gillard just officially opened the NICTA "Tech-fest" and announced an agreement between NICTA and Infosys.

Earlier an "ICT Skills Panel" discussed how to get more young people into the ICT industry, what skills they should have and how they are educated. I became a little annoyed that the panelists talked as if there was no progress on ICT skills training and e-learning in Australia. As it happened, I was responding to a question from one of the students in the ICT Sustainability course I run on-line at ANU, and was sending out a proposal for a new course on "Government 2.0 Technology and Techniques".

ICT Futures in Parliament House

Greetings from the Great Hall of Parliament House in Canberra, where NICTA is holding a "Tech-fest": several hundred ICT researchers are standing in front of dozens of displays, explaining the products of their research and how they will benefit Australia. This is very important for an organization in which the Australian Government has invested about a billion dollars. The research ranges over a wide area, from the environment, transport logistics and telecommunications to e-health and e-government.

I actually came to hear the "ICT Skills Panel", featuring Dr Stuart Feldman, VP Engineering, Google, East Coast, USA, at 1:30pm. Apparently the Prime Minister is dropping in for a visit later.

ps: In my view we need to broaden ICT professionals skills and teach ICT Masters Students How to Teach On-line.

Friday, February 08, 2013

ICT Skills Panel

NICTA are hosting an "ICT Skills Panel" in Canberra on 13 February 2013. There is a good lineup of speakers and topics (I have spent the last year considering these questions while undertaking tertiary studies).

Techfest 2013 ICT Skills Panel
  • How does Australia increase the ICT skills talent pool?
  • What do students really think of a career in ICT?
  • What incentives are there for students considering a career in ICT?
  • How do we reboot tertiary ICT education?
  • How do we make sure we are producing graduates with the right skills for new industries and business?
MODERATOR: Prof. Simon Kaplan, Education Director, NICTA
Panelists:
  • Matt Barrie, CEO, Freelancer.com
  • Suzanne Campbell, CEO, AIIA
  • David Harrison, Managing Director, Mammoth Media
  • Lucky Katahanas, ICT Entrepreneur and App Developer
  • Dr Stuart Feldman, VP Engineering, Google, East Coast, USA
ps: Unfortunately NICTA have published the details of the event in the form of a blurry image. Whoever did this must have missed the part of their ICT course covering the legal requirements for accessible web design.

Monday, November 26, 2012

Relevance Versus Diversity for Web Search Results

Greetings from the CSIRO ICT Centre at the Australian National University in Canberra, where I am attending a presentation on web search research by Kar Wai Lim of ANU & NICTA. The aim is to return web documents which are relevant to a search but also diverse, not all much the same: if you get ten copies of essentially the same document, that is not much use. It turns out there is a mathematical relationship showing the trade-off between relevance and diversity. The details are published in a short paper, which I don't pretend to understand:
Kar Wai Lim, Scott Sanner, and Shengbo Guo. 2012. On the mathematical relationship between expected n-call@k and the relevance vs. diversity trade-off. In Proceedings of the 35th international ACM SIGIR conference on Research and development in information retrieval (SIGIR '12). ACM, New York, NY, USA, 1117-1118. DOI=10.1145/2348283.2348497 http://doi.acm.org/10.1145/2348283.2348497
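The paper's own derivation is beyond me, but the trade-off itself is easy to illustrate. A standard greedy re-ranking heuristic, Maximal Marginal Relevance (MMR, a different technique from the one in the paper above), scores each candidate document as λ times its relevance minus (1 − λ) times its similarity to documents already selected. A minimal sketch with made-up scores:

```python
# Maximal Marginal Relevance (MMR): greedily pick documents that are
# relevant to the query but dissimilar to those already selected.
# lambda_ = 1.0 -> pure relevance; lambda_ = 0.0 -> pure diversity.
def mmr(relevance, similarity, k, lambda_=0.5):
    """relevance: dict doc -> score; similarity: dict (doc, doc) -> score."""
    selected = []
    candidates = set(relevance)
    while candidates and len(selected) < k:
        def score(d):
            max_sim = max((similarity.get((d, s), similarity.get((s, d), 0.0))
                           for s in selected), default=0.0)
            return lambda_ * relevance[d] - (1 - lambda_) * max_sim
        best = max(candidates, key=score)
        selected.append(best)
        candidates.remove(best)
    return selected

# Toy data: a and b are near-duplicates, c is distinct but less relevant.
rel = {"a": 0.9, "b": 0.85, "c": 0.6}
sim = {("a", "b"): 0.95, ("a", "c"): 0.1, ("b", "c"): 0.1}
print(mmr(rel, sim, k=2))   # picks a, then skips the near-duplicate b for c
```

With λ = 1.0 the same call returns the two near-duplicates, which is exactly the "ten copies of the same document" problem.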

Friday, August 31, 2012

Improving Service Delivery with Research

Greetings from National ICT Australia (NICTA) in Canberra, where James Gibson has organized a workshop on "Inventing the Future of Service Driven Enterprises" with the Service Science Society. One point to be clarified is that the "services" being discussed here are those such as climate services, green services, financial services, health services and water management services. Some of these will be delivered via a computer server, but most will involve personnel and equipment, not just a web based "service".

I did a quick search and could only find 11 formal research papers on "Service Driven Enterprise", out of the 54,000 documents on-line on the topic. In my postgraduate course "ICT Sustainability", I have one weekly module on "Enterprise Architecture".

The services which interest me are education and research supervision. As with many other services, these have been delivered as a craft based on techniques developed by trial and error and could benefit by a systematic analysis of what the customer wants and needs and how this could be effectively delivered.

In his presentation Mr Peter Alexander, brought up the issue of the role of innovation in government. This sparked an interesting discussion of the role of the public and private sectors. Peter mentioned the book "Fast Second: How Smart Companies Bypass Radical Innovation to Enter and Dominate New Markets" by Costas Markides at the LSE. Perhaps these techniques could be applied to the public service. The ACT Government is currently sponsoring "Innovation ACT" to teach university students how to take an idea and make it a product.

Workshop: Inventing the Future of Service Driven Enterprises

Date: Friday, August 31, 9:00am – 5:15pm ...

Canberra

Workshop Description

The Service Science Society invites you to a workshop where we argue that current difficulties in large IT and modernisation projects are not solvable with continuous improvement. A transformative approach is required, driven by a vision to invent the future, created through action, and informed by directed research. We present some fundamental architectural concepts which we believe will contribute to this transformation, and which differentiate it from previous failed attempts. To achieve the transformation we need impact through capability building and directed research to support the transition to Service Driven Enterprises. The approach extends to the design and building of custom cloud execution engines, tools and methods. This requires a different business model and research approach than generally used in Australia. Finally, we argue that this change is coming, and a panel considers how we can act now in order to place Australia at the forefront of enabling Service Driven Enterprises.

Agenda

  • 9:00am – 9:15am Introduction - Prof Aditya Ghose, University of Wollongong
  • 9:15am – 9:45am Necessity for major change - Prof Aditya Ghose, University of Wollongong
  • 9:45am – 10:15am Challenges in Delivery of Government Products and Services – Mr Peter Alexander, Treasury
  • 10:15am – 10:45am What Desirable Characteristics do we look for - Bridging the Creativity to Engineering Gap – Mr Chris Thorne, ATO
  • 10:45am – 11:15am Morning Tea (provided)
  • 11:15am – 11:45am What Desirable Characteristics do we look for - Enterprise Architecture - where are we now? - Dr Saul Caganoff - Sixtree
  • 11:45am – 12:30pm Inventing the Service Driven Enterprise - Transformative Change Needed - Mr James Gibson*, ANU
  • 12:30pm – 1:10pm Lunch (provided)
  • 1:10pm – 2:10pm Discussion Session: Understanding the Service Driven Enterprise - What is an SDE and what’s needed to build it? - Dr Liam O'Brien*
  • 2:10pm – 2:40pm Building Service Driven Enterprise Capability - Building the capability for timely innovation, engineering and capability development to support SDE client transitions - Dr Clive Boughton*, Software Improvements
  • 2:40pm – 3:10pm The Business Case for Change - Mr Pascal Rabbath, S-3 Consulting
  • 3:10pm – 3:40pm Services Research and Direction at CSIRO - Dr Darrell Williamson, CSIRO
  • 3:40pm – 4:00pm Afternoon Tea (provided)
  • 4:00pm – 5:10pm Panel - Achieving the Future - What practical actions should we take? What are the next steps? Chair: Dr Mike Sargent, M.A.Sargent & Associates Panellists: TBD (representatives from government, industry and academia)
  • 5:10pm – 5:15pm Workshop Wrap-up and Close - Prof Aditya Ghose, University of Wollongong

Registration Required

Friday, August 24, 2012

Service Driven Enterprises

James Gibson asked me to pass on an invitation from the Service Science Society to their workshop "Inventing the Future of Service Driven Enterprises" in Canberra, 31 August 2012.

Workshop: Inventing the Future of Service Driven Enterprises

Date: Friday, August 31, 9:00am – 5:15pm ...

Canberra

Workshop Description

The Service Science Society invites you to a workshop where we argue that current difficulties in large IT and modernisation projects are not solvable with continuous improvement. A transformative approach is required, driven by a vision to invent the future, created through action, and informed by directed research. We present some fundamental architectural concepts which we believe will contribute to this transformation, and which differentiate it from previous failed attempts. To achieve the transformation we need impact through capability building and directed research to support the transition to Service Driven Enterprises. The approach extends to the design and building of custom cloud execution engines, tools and methods. This requires a different business model and research approach than generally used in Australia. Finally, we argue that this change is coming, and a panel considers how we can act now in order to place Australia at the forefront of enabling Service Driven Enterprises.

Agenda

  • 9:00am – 9:15am Introduction - Prof Aditya Ghose, University of Wollongong
  • 9:15am – 9:45am Necessity for major change - Prof Aditya Ghose, University of Wollongong
  • 9:45am – 10:15am Challenges in Delivery of Government Products and Services – Mr Peter Alexander, Treasury
  • 10:15am – 10:45am What Desirable Characteristics do we look for - Bridging the Creativity to Engineering Gap – Mr Chris Thorne, ATO
  • 10:45am – 11:15am Morning Tea (provided)
  • 11:15am – 11:45am What Desirable Characteristics do we look for - Enterprise Architecture - where are we now? - Dr Saul Caganoff - Sixtree
  • 11:45am – 12:30pm Inventing the Service Driven Enterprise - Transformative Change Needed - Mr James Gibson*, ANU
  • 12:30pm – 1:10pm Lunch (provided)
  • 1:10pm – 2:10pm Discussion Session: Understanding the Service Driven Enterprise - What is an SDE and what’s needed to build it? - Dr Liam O'Brien*
  • 2:10pm – 2:40pm Building Service Driven Enterprise Capability - Building the capability for timely innovation, engineering and capability development to support SDE client transitions - Dr Clive Boughton*, Software Improvements
  • 2:40pm – 3:10pm The Business Case for Change - Mr Pascal Rabbath, S-3 Consulting
  • 3:10pm – 3:40pm Services Research and Direction at CSIRO - Dr Darrell Williamson, CSIRO
  • 3:40pm – 4:00pm Afternoon Tea (provided)
  • 4:00pm – 5:10pm Panel - Achieving the Future - What practical actions should we take? What are the next steps? Chair: Dr Mike Sargent, M.A.Sargent & Associates Panellists: TBD (representatives from government, industry and academia)
  • 5:10pm – 5:15pm Workshop Wrap-up and Close - Prof Aditya Ghose, University of Wollongong

Registration Required

Monday, December 19, 2011

Human-centered Computing Talk in Canberra 5 January 2012

Professor Nicu Sebe, University of Trento (Italy) will speak on "Human-centered Computing: Challenges and Perspectives" at the NICTA Seminar Room, 7 London Circuit, Canberra, 5:00 pm, 5 January 2012.
IEEE ACT - Human-centered Computing - IEEE Computer Society seminar in Canberra

Title: Human-centered Computing: Challenges and Perspectives

Date: Thursday 5 January 2012
Time: 5:00 pm - 6:30 pm
Location: NICTA Seminar Room, Ground floor, 7 London Circuit, ACT 2601

Abstract: Human Centered Computing (HCC) is an emerging field that aims at bridging the existing gaps between the various disciplines involved with the design and implementation of computing systems that support people's activities. HCC aims at tightly integrating human sciences (e.g. social and cognitive) and computer science (e.g. human-computer interaction (HCI), signal processing, machine learning, and computer vision) for the design of computing systems with a human focus from beginning to end.

This focus should consider the personal, social, and cultural contexts in which such systems are deployed. In this presentation, I discuss the existing challenges in HCC and describe what I consider to be the three main areas of interest: media production, analysis (especially retrieval issues), and interaction. I will present my current research and how this is reflected into the HCC paradigm. In addition, I will identify the core characteristics of HCC, describe example applications, and propose a research agenda for HCC.

Bio: Nicu Sebe is a Professor with the Faculty of Cognitive Sciences, University of Trento, Italy, where he is leading the research in the areas of multimedia information retrieval and human-computer interaction in computer vision applications. He was involved in the organization of the major conferences and workshops addressing the computer vision and human-centered aspects of multimedia information retrieval, among which as General Co-Chair of the IEEE Automatic Face and Gesture Recognition Conference (FG 2008), the ACM International Conference on Image and Video Retrieval (CIVR) 2007 and 2010, and WIAMIS 2009, and as one of the initiators and a Program Co-Chair of the Human-Centered Multimedia track of the ACM Multimedia 2007 conference.

He is the general chair of ACM Multimedia 2013 and a program chair of ACM Multimedia 2011. He has served as the guest editor for several special issues in IEEE Transactions on Multimedia, IEEE Computer, Computer Vision and Image Understanding, Image and Vision Computing, Multimedia Systems, and ACM TOMCCAP. He has been a visiting professor in Beckman Institute, University of Illinois at Urbana-Champaign and in the Electrical Engineering Department, Darmstadt University of Technology, Germany. He is the co-chair of the IEEE Computer Society Task Force on Human-centered Computing and is an associate editor of IEEE Transactions on Multimedia, Machine Vision and Applications, Image and Vision Computing, Electronic Imaging and of Journal of Multimedia.

Welcome ! Please feel free to invite your colleagues to this event.

IEEE ACT web page is http://www.ieeeact.org/

Tuesday, March 29, 2011

Last-Mile Disaster Preparedness and Recovery

Professor Pascal Van Hentenryck, Brown University (USA), will speak on "Last-Mile Disaster Preparedness and Recovery" at NICTA in Canberra, 6 April 2011:

Last-Mile Disaster Preparedness and Recovery

Prof. Pascal Van Hentenryck (Brown University)

NICTA SEMINAR

DATE: 2011-04-06
TIME: 12:00:00 - 13:00:00
LOCATION: NICTA - 7 London Circuit
CONTACT: Sylvie.Thiebaux@anu.edu.au

ABSTRACT:
Every year, natural disasters cause infrastructure damages and power outages that have considerable impacts on both quality of life and economic welfare. Mitigating the effects of disasters is an important but challenging task, given the underlying uncertainty, the need for fast response, and the complexity and scale of the infrastructures involved, not to mention the social and policy issues. This talk describes how to use planning and scheduling technologies to address these challenges in a rigorous and principled way. In particular, we present the first optimization solutions to last-mile disaster preparedness and recovery for a single commodity (e.g., water) and for the electrical power network. The optimization algorithms were compared to existing practice on disaster scenarios based on the US infrastructure (at the state scale) and generated by state-of-the-art hurricane simulation tools. Some of our algorithms are deployed as part of the Los Alamos National Laboratory's operational tools and provide recommendations to the U.S. Department of Homeland Security.
BIO:
Pascal Van Hentenryck is a Professor of Computer Science at Brown University. He is a fellow of the Association for the Advancement of Artificial Intelligence, and the recipient of the 2002 ICS INFORMS award, the 2006 ACP Award, an honorary degree from the University of Louvain, and the Philip J. Bray award for teaching excellence. He is the author of five MIT Press books, and most of his research in optimization software systems has been commercialized and is widely used in academia and industry.

Monday, February 21, 2011

Google Page Rank Sparks New Computing Paradigm

In "Map-Reduce and its Children" today at NICTA in Canberra, Professor Jeffrey Ullman discussed the wider uses for the approach Google uses to calculate "PageRank" for web pages:
Abstract: Since its publication by Google researchers in 2004, Map-reduce has proven to be a significant advance in programming methodology that offers resilient, easy-to-code parallel computation on a modern computing cluster or "cloud." It has led to a variety of systems that improve it in different ways. One direction is raising the level of abstraction, especially to support relational-database operations. A second direction is increasing the generality, while maintaining the programmability and resiliency in the face of partial failures. We shall review the environment of a map-reduce system, give some examples of how it works, and discuss the various extensions and the technical problems they posed.
From: Map-Reduce and its Children, NICTA, 2010
Professor Ullman started by introducing distributed file systems, as popularised by Google. He described how they are large (many terabytes), divided into chunks (typically 64 MB). These files are usually appended to, with few random updates. Chunks are replicated at compute nodes in racks connected by switches. Implementations include the Google File System (GFS), HDFS and CloudStore.

On top of the distributed file system are built map-reduce, key-value stores and SQL-like layers. Map-reduce implementations (such as Hadoop) sit alongside the object store (key-value pairs).

Map reduce allows for large numbers of highly parallel processing tasks. It must allow for failure of nodes without having to restart the entire job. Previously failure could be thought of as a rare event, but with tens of thousands of nodes, it is common.

Key-value pair databases are arguably a throwback to pre-SQL databases. SQL-like layers on top of map-reduce include Pig, Hive, Sawzall and Scope.

Map-reduce has two functions: map and reduce. Large numbers of tasks are created for each, to run in parallel, and the results are combined. Each map task operates on a distinct portion of the input file and produces key-value pairs. These are then passed to the reduce tasks, with all pairs sharing the same key going to the same reduce task, where they are combined. The results of the reduce tasks are then placed in the file store.
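As a toy illustration of that dataflow (a single-process sketch of the idea, not how Hadoop or Google's system actually distributes work), word counting is the standard example:

```python
from collections import defaultdict

# A minimal single-process sketch of the map-reduce dataflow.
def map_fn(line):                     # one map task per portion of input
    for word in line.split():
        yield (word, 1)               # emit key-value pairs

def reduce_fn(key, values):           # all pairs with the same key
    return (key, sum(values))         # arrive at one reduce task

def map_reduce(inputs, map_fn, reduce_fn):
    groups = defaultdict(list)
    for item in inputs:               # the "shuffle": group pairs by key
        for k, v in map_fn(item):
            groups[k].append(v)
    return dict(reduce_fn(k, vs) for k, vs in groups.items())

print(map_reduce(["to be or", "not to be"], map_fn, reduce_fn))
# -> {'to': 2, 'be': 2, 'or': 1, 'not': 1}
```

In a real cluster the map tasks, the shuffle and the reduce tasks each run on many nodes; this sketch only shows the logical structure.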

Clearly this design is intended for tasks such as indexing web pages for searching. It is good for matrix multiplication and relational algebra (such as a natural join). It assumes there are few updates of data in place, just data added (an assumption which works well for many web applications).
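To see how relational algebra fits the pattern, a natural join of R(a, b) and S(b, c) can be expressed by mapping each tuple to its join attribute as the key, and having the reduce step pair off tuples from the two relations. A sketch, again single-process and with invented toy data:

```python
from collections import defaultdict

# Natural join R(a, b) |><| S(b, c) in the map-reduce style.
def join_map(tagged_tuple):
    relation, t = tagged_tuple
    if relation == "R":
        a, b = t
        yield (b, ("R", a))           # key on the shared attribute b
    else:
        b, c = t
        yield (b, ("S", c))

def join_reduce(b, values):
    r_side = [a for tag, a in values if tag == "R"]
    s_side = [c for tag, c in values if tag == "S"]
    return [(a, b, c) for a in r_side for c in s_side]

def run(tuples):
    groups = defaultdict(list)
    for t in tuples:                  # shuffle by join key
        for k, v in join_map(t):
            groups[k].append(v)
    return sorted(out for k, vs in groups.items() for out in join_reduce(k, vs))

R = [("R", (1, "x")), ("R", (2, "y"))]
S = [("S", ("x", 10)), ("S", ("x", 20))]
print(run(R + S))   # -> [(1, 'x', 10), (1, 'x', 20)]
```

The tuple with b = "y" has no partner in S, so it contributes nothing, just as in an ordinary natural join.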

The process for dealing with a failed task is simple: restart the task. Professor Ullman commented that on real systems about 5% of failures are due to software problems (such as the version of Java not having been upgraded).

Map reduce can be generalised to allow any number of tasks to be strung together. Despite the complexity, such a system can be designed to be fault tolerant: either the processing is still underway or it has finished successfully. This sounds good, but still seems to be at the experimental stage. The simpler map-reduce is easier to get to work as there are just two tasks before the results are safely stored in the file system.

It would be interesting to look at how generally applicable these techniques are. Normally you would think they would work for web applications, such as indexing web pages, but not traditional databases, such as bank records. Financial records in a bank require frequent update of data in place (such as the current balance). However, banks have to keep transaction records. Normally these transaction records are seen as less important than the current balances, and just something needed occasionally. But if the client has access to their bank account on-line, they are more likely to be looking at the history. This makes the transaction records important. Also "cloud" based applications may result in more aggregations of data which suit this approach.

The Professor argued that recursion is now key to web based applications, such as PageRank and web structure. However, this appears to be an area for research, not practical implementation.
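The PageRank computation itself shows what that recursion looks like: a page's rank depends on the ranks of the pages linking to it, so the whole vector is computed by repeated redistribution until it settles. A toy sketch (my illustration of the textbook algorithm, not Google's implementation):

```python
# PageRank by power iteration: repeatedly redistribute rank along links.
def pagerank(links, damping=0.85, iterations=50):
    """links: dict node -> list of nodes it links to."""
    nodes = list(links)
    n = len(nodes)
    rank = {u: 1.0 / n for u in nodes}
    for _ in range(iterations):
        new = {u: (1 - damping) / n for u in nodes}
        for u, outs in links.items():
            targets = outs if outs else nodes   # dangling node: spread evenly
            share = rank[u] / len(targets)
            for v in targets:
                new[v] += damping * share
        rank = new
    return rank

links = {"a": ["b"], "b": ["a", "c"], "c": ["a"]}
ranks = pagerank(links)
print(max(ranks, key=ranks.get))   # "a" is most linked-to, so ranks highest
```

Each iteration is itself a map-and-combine step over the link structure, which is why recursion on top of map-reduce matters for this kind of application.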

Friday, February 04, 2011

Teaching Cloud Computing to Google

Professor Jeffrey Ullman will speak on "Map-Reduce and its Children", 21 February 2011 in Canberra.
Professor Jeffrey Ullman
Map-Reduce and Its Children

Abstract
Since its publication by Google researchers in 2004, Map-reduce has proven to be a significant advance in programming methodology that offers resilient, easy-to-code parallel computation on a modern computing cluster or "cloud." It has led to a variety of systems that improve it in different ways. One direction is raising the level of abstraction, especially to support relational-database operations. A second direction is increasing the generality, while maintaining the programmability and resiliency in the face of partial failures. We shall review the environment of a map-reduce system, give some examples of how it works, and discuss the various extensions and the technical problems they posed.

Biography
Jeff Ullman is one of the world's best known computer scientists. His contributions to theoretical computer science and its applications to compilers, parallelism, and databases, as well as his textbooks in those areas, have been widely acknowledged, notably by the 2000 Knuth Prize and the 2010 IEEE John von Neumann Medal. He was the advisor of an entire generation of PhD students, including Sergey Brin, one of the co-founders of Google. Currently, Jeff Ullman is the Stanford W. Ascherman Professor of Computer Science (Emeritus) as well as CEO of Gradiance, a corporation he founded to provide online homework and programming lab support for college students. Prof. Ullman's research interests include database theory, database integration, data mining, and education using the information infrastructure.

Thursday, November 25, 2010

Femtocells for Wireless Broadband

Professor Mark Reed, Principal Researcher, NICTA, talked on "Next Generation Wireless: How can we solve the data crunch?" at the Australian National University in Canberra, this morning. This was timely as I was at a workshop with people from the Bangladesh government discussing "e-Government for Developing Nations". One key point is that wireless allows skipping generations of telecommunications technology.

Mark argued that femtocells can provide greatly expanded broadband wireless, and outlined research areas. However, it seemed to me the major issue was how this technology could be integrated into the business models of telecommunications companies (as illustrated by the lack of a business case holding up progress with the National Broadband Network). In other words, the research question is: "How do we make money out of this?". As an example, one way would be for the customer who buys a femtocell to share in the revenue from others using the cell (this happens with some public WiFi systems).

Previously I suggested wireless be built into the NBN modems installed in homes. This would provide public femtocells and a very profitable supplement to the wired service.

One technical area for research is how to carry video efficiently. As Mark pointed out, video is the major driver of wireless use, but it has very different characteristics from voice transmission and web access. It should be feasible to make video delivery hundreds of times more efficient with a few simple protocol tweaks; changes to the network topology could make it many orders of magnitude more efficient again. Some changes are relatively simple, such as changing packet sizes and priorities; others will require hardware changes, such as putting caching in the cells. While there are many millions of videos which people might watch, only a relatively small number will be being watched by most people at any one time. Also, the system can anticipate what people will want to watch and download it when there is spare network capacity.
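The cell-caching idea can be illustrated with a toy popularity cache. All names and the eviction policy here are hypothetical, chosen only to show the principle: a small cache holding the currently popular videos can serve most requests locally, sparing the backhaul.

```python
from collections import Counter

class CellVideoCache:
    """Toy popularity-based cache for a femtocell: keep the few videos
    most viewers are watching right now; everything else would be
    fetched from the core network. Purely illustrative."""

    def __init__(self, capacity=3):
        self.capacity = capacity
        self.requests = Counter()   # observed demand per video id
        self.cached = set()

    def request(self, video_id):
        """Record a request; return True if it was a cache hit."""
        self.requests[video_id] += 1
        hit = video_id in self.cached
        # Re-rank: keep only the currently most-requested videos.
        self.cached = {vid for vid, _ in
                       self.requests.most_common(self.capacity)}
        return hit

cache = CellVideoCache(capacity=2)
for vid in ["a", "b", "a", "c", "a", "b", "a", "b"]:
    cache.request(vid)
print(cache.cached)  # the two most-requested videos stay local
```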

Another issue is the use of mesh networks. With this arrangement, the consumer's handsets and base stations can communicate with each other, supplementing the fixed infrastructure. This could be used with intelligent and predictive caching.

Tuesday, November 23, 2010

Electronic Patient Records for Better Health

Professor Hercules Dalianis and Sumithra Velupillai, from Stockholm University will speak on how e-health records can assist doctors, 11am, 8 December 2010, at NICTA in Canberra:

How can we use clinical corpora to assist the clinician, her managers and clinical research? & Modeling factuality levels of diagnoses in Swedish clinical records for information access

Professor Hercules Dalianis and Licentiate Sumithra Velupillai (Department of Computer and Systems Sciences, (DSV), Stockholm University, Sweden)

NICTA SML SEMINAR

DATE: 2010-12-08
TIME: 11:00:00 - 12:30:00
LOCATION: NICTA - 7 London Circuit
CONTACT: Hanna.Suominen@nicta.com.au

ABSTRACT:
Speaker: Hercules Dalianis, Department of Computer and Systems Sciences, (DSV), Stockholm University, Sweden

Title: How can we use clinical corpora to assist the clinician, her managers and clinical research?

Today a large number of Electronic Patient Records (EPRs) are produced for legal reasons, but they are never reused, either for clinical research or for business (hospital) intelligence. Moreover, it is also alarming that the clinician's daily work in documenting the patient status is rarely supported in a proper way. We are aiming to change these facts. Clinical corpora form an abundant source from which to extract valuable information that can be used for this purpose.

The Stockholm EPR Corpus is a huge clinical corpus written in Swedish, containing over one million patient records distributed over 800 clinics encompassing three years from the Stockholm area. We have explored subsets of this corpus with the aim of understanding the whole corpus and its domain(s). In one experiment we annotated a subset of the corpus for de-identification, and we created a gold standard for training and evaluation of automatic de-identification tools. In another experiment we investigated the relations of diagnosis codes (ICD-10) for co-morbidity analyses and found interesting results. We have also developed a method for automatic support in assigning new ICD-10 codes on newly entered clinical text, but also for evaluating already assigned ICD-10 codes. Finally we have tried to understand what exactly is written in the corpora, with the aim to construct information extraction tools that can distinguish between the factuality of diagnoses. Is the diagnosis certain, negated, or uncertain to some extent? Two annotators with clinical background have annotated a subset of the corpus for factuality levels.

Speaker: Sumithra Velupillai, Department of Computer and Systems Sciences, (DSV), Stockholm University, Sweden

Title: Modeling factuality levels of diagnoses in Swedish clinical records for information access

Retrieving relevant information from Electronic Patient Records (EPRs) is a challenging task, since there are different information needs in different situations. Moreover, this document type is a good example of where traditional information retrieval methods such as those applied in search engines are not sufficient; simply searching for keywords and retrieving ranked lists of documents is an insufficient way of exploiting the knowledge, experience and information contained in EPRs.

We have initiated the creation of Swedish clinical corpora manually annotated for factuality levels in clinical records. The first experiment was carried out on a sentence and token level, manually annotated by three laymen. Following this work, a second experiment has been carried out, focusing on assessment descriptions from clinical records from an emergency department. In this task, the annotators (clinicians) are given a diagnosis to be judged for factuality levels (certainly, probably and possibly positive or negative).

The created corpora will be used for automatic classification experiments. We want to be able to answer the following questions: is it feasible to automatically classify factuality levels of diagnosis descriptions in Swedish EPRs? Which diagnosis types are harder to judge when it comes to factuality levels? Which features are indicative? Are these different depending on diagnosis types? Are some diagnoses inherently uncertain/speculative? What does this imply?

In the future, we envisage information access systems that are able to distinguish these types of factuality levels automatically, information that could be utilized in information access systems, (semi-)automatic summarization applications, hypothesis generation and clinical research, etc.
BIO:
Dalianis is an associate professor (docent) and tenured lecturer (universitetslektor) at the Department of Computer and Systems Sciences (DSV) at Stockholm University, Sweden, where he heads the research area IT for Health. Dalianis received his Ph.D. in 1996. He was a post-doctoral researcher at the University of Southern California/ISI in Los Angeles 1997-98, and held a three-year guest professorship at CST, University of Copenhagen, during 2002-2005, funded by Norfa, the Nordic council.

Dalianis works at the interface between university and industry, with the aim of making research results useful for society. He has specialized in the area of human language technology: making computers understand and process human language text, but also making computers produce text automatically. Examples of applications are automatic text summarization and search engines with built-in human language technology support, such as stemming, spell checking and compound splitting, to improve information extraction. Currently Dalianis works in the area of text mining and medical informatics, focused on electronic health records. He has more than 20 years of experience in his research area, and has been project leader and received funding for over 15 national, Nordic and European research projects.

Velupillai has been a PhD student at the Department of Computer and Systems Sciences at Stockholm University since April 2007. She successfully defended her Licentiate Thesis, "Swedish Health Data: Information Access and Representation", on the 6th of October, 2009. Velupillai is also affiliated with the Swedish National Graduate School of Language Technology (GSLT), has participated in several research projects, and is currently part of the Nordic research network HEXAnord. Velupillai has a background in Computational Linguistics and specializes in research covering Language Technology, Information Access and Health Informatics. She has published and presented eighteen articles at renowned international conferences and in journals.

Friday, November 19, 2010

Next Generation Wireless

Professor Mark Reed, Principal Researcher, NICTA, will talk on "Next Generation Wireless: How can we solve the data crunch?" at the Australian National University in Canberra, 11am, 25 November 2010. Wireless is one of the options for completing the National Broadband Network (NBN). This is a free talk, no need to book, just turn up:
Next Generation Wireless: How can we solve the data crunch?

Assoc. Prof. Mark Reed (NICTA)

APPLIED SIGNAL PROCESSING SERIES

DATE: 2010-11-25
TIME: 11:00:00 - 12:00:00
LOCATION: RSISE Seminar Room, ground floor, building 115, cnr. North and Daley Roads, ANU
CONTACT: charlotte.hucher@cecs.anu.edu.au

ABSTRACT:
With the forecast exponential growth of mobile broadband over the next few years and the availability of 3G mobile systems that are spectrally efficient, there remains a question of whether 4G systems will solve the key problems of coverage, throughput, and cost. This talk will discuss next generation wireless systems taking insight and input from commercial drivers and needs. It will explore the "data crunch" issue driven by smartphones and social networking and highlight that spectrum allocation and the deployment of 4G/LTE will not alone solve the problem. Interestingly, this initiates a lot of new and interesting research problems that haven't been explored in any depth by the research community, including small cell (femtocell) technology and self organising network (SON) technology.


BIO:
Mark Reed is a leading researcher in the area of WCDMA receiver and network design with more than 18 years of experience, with positions in the USA, Switzerland, and Australia. He received his B. Eng. (Honours) from RMIT in 1990 and Ph.D. in Engineering from the University of South Australia in 2000. He is an Adjunct Assoc. Prof. at the Australian National University and a Principal Researcher and Project Leader at NICTA, where he has been since 2003 and leads a team on a research-inspired commercial project. Mark pioneered the area of iterative detection techniques for WCDMA base station receivers and has more than 60 publications and eight patent applications. He has a mix of real-world industrial experience as well as research experience, and continues to put his techniques into practice. Mark has previously performed research and developed real-time world-first Satellite-UMTS and mobile WiMAX demonstration systems. Recently Mark has led a team to realize a real-time WCDMA femtocell modem working at RF and tested against independent equipment. This realization contains world-first advanced receiver techniques that significantly improve the uplink throughput and range. Mark is a senior member of the IEEE and from 2005-2007 was an Associate Editor for the IEEE Transactions on Vehicular Technology.

Tuesday, August 31, 2010

Sharing Entrepreneurial Experiences with the Canberra Community

Brand Hoff, founder of TOWER Software (1985), will speak on "Sharing Entrepreneurial Experiences with the Canberra Community", at NICTA, Canberra, 14th September 2010.
Meet the Founder Series
Sharing Entrepreneurial Experiences with the Canberra Community
Mr Brand Hoff
“The TOWER Software Story from Micro to Multi”
Brand Hoff founded TOWER Software in 1985 and grew the company from two people to 240 people worldwide. In 2008, the company was successfully sold to Hewlett Packard.

When: 5.30pm for canapes, Tuesday 14th September
Where: NICTA Seminar Room, Ground Floor, 7 London Circuit, Canberra
RSVP: crlevents@nicta.com.au by Thursday 8th September

Mr Brand Hoff brings significant expertise to NICTA in the areas of information technology, packaged software research and development, Web applications research, small-and-medium enterprises (SMEs), and commercialisation.

Mr Hoff has 40 years in the IT industry. In the early 1970s he developed a private data entry LAN and data entry equipment for the Commonwealth Treasury.

Mr Hoff also designed and developed one of the first national, and later international, communications networks for Treasury.

He was responsible for the purchase of the first IBM plug-compatible mainframe for the Commonwealth Government.

As Director of Computer Services, he was responsible for the development of the Australian Bibliographic Network for the National Library of Australia.

Between 1981 and 1985 Mr Hoff established the Canberra Consulting Division of CSC, a large US systems consulting company.

Until 2001 he was Managing Director of TOWER Software Engineering, a business he founded in 1985 and grew to be a Canberra-based multinational. TOWER Software, famous for its TRIM product, is a leading enterprise content management firm with overseas subsidiary companies in the United States and United Kingdom and offices in Canada, New Zealand, Northern Ireland and Holland.

TOWER has achieved outstanding success in the development of the TRIM product, including three prestigious AIIA iAwards. Other awards include NSW Emerging Exporter of the Year 1997, Australian Capital Territory (ACT) Small Business of the Year 1998 and National Winner AusIndustry Innovation Award 1998.

TOWER was sold to Hewlett Packard in May 2008 after a formal takeover offer was tabled by HP.

Mr Hoff has a BA degree in Computer Studies from the University of Canberra and is fellow of the Australian Institute of Company Directors.

He has served as Chairman of the Information Industry Development Board and as Chairman of the Knowledge Based Economy Board which advises the ACT Chief Minister and ACT Treasurer. Mr Hoff was a Member of the University of Canberra Council and holds a number of additional Board positions on ICT based SME Companies.

Tuesday, April 06, 2010

Implant Technologies for bionic eyes and ears

Dr John Parker will talk on “The Implant Systems Experiment” in Canberra, Wednesday 7th April 2010.
Big Picture Seminar Series
Increasing the chances for commercialisation of research: "The Implant Systems Experiment"
Dr John Parker, CTO, Implant Systems NICTA
When: 12.15pm for 12.30pm start Wednesday 7th April
Where: NICTA Seminar Room, Ground Floor, 7 London Circuit, Civic
RSVP: crlevents@nicta.com.au by Tuesday 6th April

Dr Parker spearheads NICTA's Implant Technologies Group. He has been an Executive Director and CTO of Cochlear Limited and is an experienced director of both listed and non-listed companies and CRCs.

The power of modern electronics and computer science has far exceeded our ability to deploy them in active medical devices. Complex systems for sensing, controlling and affecting a therapeutic biological response are "easily" demonstrated on the bench but defy continual operation inside the body, and as a result even the most technically advanced implantable devices remain very simple.

The architecture of neuro-modulation systems hasn't changed in 25 years, despite the technical advances in allied disciplines. This stagnation is due to the lack of a number of platform technologies: tissue interfaces, packaging and scalable system architectures.

The development of these technologies is the focus of the research effort at NICTA's Implant Systems group.

From: The Implant Systems Experiment, Big Picture Seminar Series, NICTA, 2010