Thursday, March 23, 2006

Ordnance Survey. Planning for the Future

Ordnance Survey has published its strategy for the next two years. How
much will the face of public sector information have changed in that time?

by Kate Worlock EPS, Director

Ordnance Survey’s “Geographic Information Strategy 2006-8” document
contains few surprises. As befits the OS’s status as both a national
mapping agency and a self-funding government department, the strategy is
based around a two year business plan which aims to deliver on OS’s
strategic goal: “to provide the underpinning foundations on which Great
Britain’s GI [geographic information] infrastructure and market can
develop in a coherent, rational way”. One of the key components of this
strategy is OS MasterMap, a geographic database consisting of around half
a billion topographic features, built to be compatible with web standards
and able to be ordered through a web interface.

The value of a tool of this sort when it comes to delivering joined-up
government is clear, but arguments continue to rumble about the nature of
public sector information in the UK. A week before the OS strategy document
was published, the Guardian newspaper launched its ‘Free Our Data’
campaign, which aims “to persuade the government to abandon copyright on
essential national data, making it freely available to anyone, while
keeping the crucial task of collecting that data in the hands of
taxpayer-funded agencies”. The Guardian’s argument rests on two
tenets. The first is that governments should not run businesses because
they are too inflexible to cope with the demands of a rapidly changing
commercial environment – this is particularly apposite given the speed of
change within the networked economy. The second is that governments must
be relied upon to collect the best data – few commercial organisations
have the scale to undertake these tasks, and if they do they are liable to
ignore less profitable sectors or areas.

One piece of evidence put forward by the Guardian is a Pira International
report compiled for the European Commission in 2000, which found that a
more open approach to government-collated data, such as that espoused by
the US, is hugely beneficial economically. The report argued that the US
and the EU are comparable in size, but while the EU spent €9.54 billion on
collecting public sector data and generated €68 billion by selling it, the
US spent twice as much on collection (€19 billion) yet earned more than 10
times as much from its sale (€750 billion). Further economic arguments come in
a paper from the late Peter Weiss of the US National Weather Service, who
argued that governments not only attract higher tax revenues from
increased sales of products that incorporate public sector information,
but also benefit from higher income tax revenues and lower social welfare
payments from net gains in employment.

There is also an argument that allowing free access to public sector
information encourages innovation and competition within the market –
services like Google Maps, Yahoo! Maps, and Microsoft MapPoint, and the
mashups that have developed around them, support this point.
Interestingly, all of these services are based around licensing commercial
mapping data from either Tele Atlas or Navteq. These intermediary players
use government data as an integral part of the mix, but then enhance it
considerably using fieldwork and other techniques – Tele Atlas, for
example, captures video images using vans which drive across Europe and
the U.S., enabling the company to process changes and updates at higher
quality and five times faster than traditional methods allow.

The Guardian’s campaign is only one of a number of initiatives concerning
public sector information in the UK. Others include APPSI (the Advisory
Panel on Public Sector Information) and OPSI (the Office for Public Sector
Information), both of which report to the Cabinet Office [Disclosure – EPS
Chairman David Worlock is a member of APPSI, representing online
publishers]. The organisation that is perhaps the most likely to
influence future direction is the Office of Fair Trading (OFT), which is
undertaking a market study due for publication in summer 2006. The OFT
might do well to look at other publishing markets in the course of
compiling its findings – Free Our Data sounds very much like the cry of
Open Access advocates in scholarly communications, and the arguments are
similar in many ways. Government really has two options: to sell
information whose creation is paid for by the taxpayer to those taxpayers
who can pay market prices for it, or to have this information made
available at low or no cost (covering the cost of copying at most) so that
enterprising businesses can use it to create well-used services that feed
the tax base and satisfy user needs. The latter seems to have the weight
of economic and social argument behind it, but whether this will be
reflected in the OFT’s report remains to be seen.

Sunday, March 19, 2006

So what is Structured Visual Thinking?

Structured Visual Thinking™ is the term we use to describe the ‘physics’ involved in synthesising complex and disparate content. It covers techniques for aggregating conversation, together with frameworks and precisely designed tools that force clarity from chaotic and random information or knowledge. Structured, visual thinking creates the opportunity for clear thinking. The definition covers both the process of facilitation and co-creation and the use of proprietary templates, blueprints, frameworks, grids, schemas, designs, stimulus materials, diagrams, devices, sets and frames.

Structured Visual Thinking™ has significant value for clients in defining problems, prioritising alternative choices, ranking options to support preferential or time-based decision making and planning, defining current realities and describing the vision of future states. The approach uses a mixture of pre-built and co-created structures and frameworks that enable clarity and allow practical outcomes to be sustained and measured.