SDI Architecture Workshop Review Notes


An MS Word copy of the following document can be downloaded here

 

 

 Academic SDI Technical Architecture Workshop

 

 

Review Notes from Workshop on 29 August 2008 at EDINA

prepared 5 September 2008, draft 1.0

by R A Longhorn, Facilitator

 

 

Introduction and Background.

 

 

This review is not presented as a “he said, she said” verbatim meeting report, but rather lists the key comments and issues raised throughout the day. The structure of the workshop followed the agenda found in Annex 1. The session was introduced by EDINA with a brief report on the prior workshop of 15 May, which was attended by 25 people from across the academic sector and focused on geodata needs, uses, future, etc.; the current workshop was a follow-up action from that meeting, looking more specifically at the “academic SDI”. Both workshops took place under the auspices of the e-Framework Work Package within the JISC-funded SEE-GEO project.

 

 

Roger Longhorn acted as facilitator, working through two sets of slides (Annexes 2 and 3), focusing on the “Data and Content” and “Technology and Tools” results from the prior workshop, as general guidelines to the range of “academic SDI” issues that were the focus of the workshop on 29 August.

 

 

Participants and contact details are in Annex 4. Annex 5 contains some supplementary material. Annex 6 contains the “Consolidated List of Recommendations” from a January 2007 JISC GWG Vision document, many of which are very pertinent to, and reflect, recommendations emerging from the workshops of 15 May and 29 August. Those workshops lend further support to these earlier recommendations – and raise questions as to what has happened in regard to implementing them.

The goal of this workshop was twofold:

1.      Establish the main elements of the technical architecture of the UK academic sector SDI.

2.      Identify key steps necessary to progress the UK academic SDI.

The main outputs will be a list of recommendations relating specifically to the needs for, and content of, an “Academic SDI” (ASDI), which can be put forward to JISC for further action. The next phase will be to produce a roadmap to inform decisions on where JISC, and possibly the research councils, should make strategic infrastructure investments to further develop the academic sector’s spatial data infrastructure, thus enhancing both the research and teaching capacity of UK academia. The draft will be circulated to all participants from both workshops for further comment; a final report will then be produced, circulated more widely to the academic community and presented to JISC.

One conclusion, to which all participants seemed to agree, was that JISC should be more involved in on-going development of the UK’s Location Strategy (the UK’s spatial data infrastructure strategy) as well as liaising with DEFRA, who are responsible for advising government on how to implement the INSPIRE Directive (Infrastructure for Spatial Information in the European Community) over the coming years, as INSPIRE Implementing Rules are published.

 

 

Discussion Issues and Comments.

 

 

The initial position was that there are special needs for an ASDI, since the geodata needed is for education (pedagogical) and research purposes. The geodata activities and needs of the academic community are not identical to those of government generally or of business.

 

 

Why is ASDI unique compared to other SDIs (thematic, sectoral, national)?

 

 

·         Much of the geodata in the ASDI is obtained under licence from government funded agencies for academic sector use, i.e. education (teaching) and academic research (as opposed to commercially funded research).

·         Both education (teaching) and academic research are in the national interest, and it is widely appreciated that the sector merits special treatment in respect of access to data. This means that a broad spectrum of data may, or should, be made available at reduced or no cost.

·         Unlike SDIs servicing other sectors, the ASDI needs to accommodate non-production-strength components, for example web services that are the subject of ongoing research but which their owners wish to expose to the wider community.

·         The ASDI must have the flexibility to allow experimentation and innovative use of the ASDI.  Most SDIs are designed in a less flexible manner than this.

·         A ‘proto’ ASDI already exists, for example the services developed over the past several years by the JISC National Data Centres EDINA and MIMAS. However, many of these were developed with a specific mode of access in mind, i.e. applications delivered to users through web browsers. They are not interoperable (mainly due to restrictions imposed by data licences) and, where interoperability exists at a technical level, it is based on legacy standards or earlier versions of current open standards (a simple client-side check of the standard version a service advertises is sketched after this list).

·         Rather than look only at what is different or unique about an ASDI, we also need to look at the many similarities to other SDI components, e.g. the requirements of an ASDI could be met through use of published standards for metadata, interoperability of services, etc.

·         A key issue is to ensure interoperability, e.g. with the NERC Data Grid, not only among academic geodata users but also with geodata services outside the academic realm, since non-academic services will provide access to content and services of interest to users in academia. Hence the need to be cognisant of activities relating to the UK Location Strategy, UK implementation of INSPIRE, and existing regional UK SDI strategies (Wales, Northern Ireland, Scotland).

·         Since UK academic research often involves partnering with institutions outside the UK, and since even totally UK-based research activities need access to ‘foreign’ datasets, models and services arising outside the UK, this is another reason that interoperability with other SDIs is important, including those being developed outside the UK.
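As a concrete illustration of the technical-interoperability point above (the ‘proto’ ASDI bullet), the short Python sketch below uses the OWSLib library to ask an OGC Web Map Service which version of the standard it advertises and which layers it exposes. This is a minimal sketch under stated assumptions: the endpoint URL is a placeholder, not a reference to any actual EDINA or MIMAS service, and OWSLib is simply one convenient open source client library.

```python
# Minimal sketch: ask a (hypothetical) OGC WMS endpoint which standard version
# it implements and which layers it exposes. Requires the OWSLib library.
from owslib.wms import WebMapService

WMS_URL = "https://example.ac.uk/wms"   # placeholder endpoint, not a real ASDI service

def summarise_wms(url: str) -> None:
    """Print the advertised WMS version and the layers with their WGS84 bounding boxes."""
    wms = WebMapService(url, version="1.3.0")   # older services may only speak 1.1.1
    print("Service:", wms.identification.title)
    print("WMS version:", wms.identification.version)
    for name, layer in wms.contents.items():
        print(f"  layer {name}: bbox (WGS84) = {layer.boundingBoxWGS84}")

if __name__ == "__main__":
    summarise_wms(WMS_URL)
```

A check of this kind is enough to reveal whether a service is still pinned to a legacy specification version, which is one practical measure of the interoperability gap described above.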

 

 

Formal versus Informal SDI?

 

 

A lengthy discussion then developed (and was revisited during the workshop) on the differences between “formal” and “informal” geodata infrastructures, where components of a ‘formal’ SDI were described as those that, for example, followed published and accepted standards (enabling interoperability), versus a looser structure that mandated minimal or no standards and allowed organic evolution. This latter approach is more likely to be driven from the ground up, and may be instigated by general purpose ‘mapping’ services, such as Google Earth/Maps, Yahoo, Microsoft Virtual Earth, etc. (referred to by some as ‘GYM’). The logical conclusion for these general purpose mapping services is to start to connect to established networks (for example INSPIRE) and to each other. These moves are usually demand driven rather than anticipatory, de facto, less structured and more flexible. These new services are hugely popular and are making geospatial information publicly accessible. Traditional SDIs should take advantage of them and develop ways to engage with them in order to maximise the potential and use of geospatial information in the future.

 

 

“Formal” versus “Informal” SDIs.

·         “80% of archaeological data would probably fall in the ‘informal’ category”, and much is presented to students, researchers and the public via services such as Google – because they are easy to use with a small learning curve. Much research arising in the medical field was also being viewed using ‘informal’ GIS, such as Google, for presentation purposes and decision-making, since (a) it is easy to do and (b) medical researchers have neither the time nor the interest to learn ‘GIS’ skills as well.

·         Services provided by GYM and others are drawing an ever increasing number of (new) users into geodata/map awareness (which has to be a good thing, especially for educators?), partly because of their ease of use and easy access, e.g. users don’t have to register.

·         A more formal approach to geodata use is required for much academic research, especially in the physical sciences, where the data can be more complex and spatial analysis, modelling, etc. are important, compared to purely “publication oriented” services such as offered by GYM. Standards are more important. Interoperability (the ability to use multiple datasets and services from multiple sources) is important as is a desire to use quality data from an assured source.

·         Civil engineering students, archaeology students and medical researchers are all using information systems such as those provided by GYM today, because (a) they serve a purpose and (b) they are easy to use. The downside is that datasets made available via such ‘informal’ services are seldom interoperable, are not easy to find, have no guarantee of preservation (a topic visited later in the day), carry no formal metadata, etc.

·         The reason informal systems are being used is the “huge barrier to entry (to using geodata) at the technical level” for traditional SDI (referring to the steep learning curve in using formal GIS tools and the difficulty in complying with SDI standards). There needs to be a different – or additional? - set of tools developed that make it easier for students – and researchers – who are not, and don’t want to be, GIS experts, to still be able to use the geospatial attributes in their data. Likewise the lack of support in the use of more advanced tools is a barrier. A researcher who decides to use such technology can quickly run into problems because of the lack of local expertise to support him or her.

·         One might consider “throwing away all that went before and then just using Google” – but then Google itself would have to be made more interoperable, more ‘formal’, if it were to (a) suit the needs of the research community and (b) fit into existing and developing national and thematic SDIs, all of which now follow formal implementation regimes (standards, protocols, etc.). In other words, they would end up reinventing the standards.

·         Maps and data provided by the likes of GYM are useful to provide ‘backdrops’ for presenting geodata to decision makers or others who are not GIS technical specialists – and who don’t want to be – but not for more advanced forms of use. Even for the ‘backdrop’ form of use there are still usage restrictions that educators and researchers need to be aware of – and which many seem not to be. For example, a tutor who produces a class exercise based on the Google Maps API and runs it on a local intranet would be breaching Google’s terms of use.

·         Academia – and formal SDI developers? – need to engage more fully with the GYM user community to tap into the energy and ideas being expressed and better fulfil their role as educators and researchers.

·         Formal SDI developers need to engage more fully with GYM and their user communities so that SDI resources are used as effectively as possible.

·         An Academic SDI seems to be more like a National (formal) SDI than a thematic one, e.g. in the marine or meteorological arena. This is because the range of uses and users is very wide, there are numerous thematic groups who would use an ASDI. There is therefore considerable overlap in the functions or components of an SDI between academic requirements and those described in most NSDI strategies and implementation plans, including INSPIRE.

 

 

Metadata

 

 

·         In the majority of cases, geospatial datasets created by academics are a by-product of their research and not its main purpose. Most research councils require data collected as part of a research project to be deposited with one of their research institutes or archives. Not all data is accepted and not all data is offered up for archiving.

·         The question arises of how to create metadata for an “informal dataset” created by a person without training in standards – or perhaps even without the desire to ‘waste time’ creating metadata – a “whatever that is” attitude towards metadata held by many users, especially those new to the scene.

·         Creating (formal) metadata is necessary not only for primary datasets, but also for derived datasets – where the task can be more difficult, especially in relation to provenance metadata, etc.

·         As applications become more complex and the desire to share data increases we see a paradox emerge. “We want more and more metadata, but want to devote less time to creating it – and without special skills needed to do so.” Is the answer to be found in better metadata creation tools or services?

·         There was general agreement that there must be data at the back end of metadata geoportals/catalogues – not simply some often out-of-date references to a dataset that is no longer generally accessible.

·         The basic principle must be to encourage metadata creation as much as possible, make it as easy and minimal as possible, and provide the training and tools needed to make this feasible for as large a part of the academic community as possible. Existing tools mentioned were GeoDoc in Go-Geo! and MetaGenie. Some GIS applications also provide tools to create metadata (a minimal sketch of what a lightweight metadata-creation helper might look like follows this list).

·         Metadata creation tools need to be bundled with applications, and offer ease of use.

·         There is a potential problem with metadata becoming larger than the data itself. How can this be overcome or prevented – or is it inherent in the way metadata standards are now being defined? Can profiling help resolve this?

·         None of the above is going to happen without researchers, schools and institutions taking the issue of data management more seriously. There is a need for appropriate institutional policies with respect to data management, which includes geospatial data. Data management planning and implementation will need to occur at a policy level (overall frameworks), within institutions (alignment of institutional practice with national, international and discipline policies and best practices), and within research groups (compliance with discipline practices as well as national and institutional requirements).

·         The merits of informal metadata in the form of user tagging/annotation were discussed.  This approach has the significant benefit of only requiring a few minutes and having a completely flexible format, which makes people more inclined to use it because it does not require them to think too hard.
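To make the ‘easy and minimal’ principle above concrete, the sketch below shows one possible shape for a lightweight metadata-creation helper: it reads the spatial extent and coordinate reference system directly from a raster file (using the rasterio library, an arbitrary choice) and writes a handful of Dublin Core-style elements. It is illustrative only, under assumed file paths and field choices, and is not a description of GeoDoc, MetaGenie or any other existing tool.

```python
# Illustrative sketch of a minimal metadata-creation helper: pull the spatial
# extent and CRS out of a raster file and emit a tiny Dublin Core-style record.
# rasterio is an arbitrary library choice; GeoDoc and MetaGenie work differently.
import datetime
import xml.etree.ElementTree as ET

import rasterio

DC = "http://purl.org/dc/elements/1.1/"

def minimal_record(path: str, title: str, creator: str) -> str:
    """Return a small XML record describing the raster dataset at `path`."""
    with rasterio.open(path) as src:
        b = src.bounds                                   # left, bottom, right, top
        crs = src.crs.to_string() if src.crs else "unknown"

    ET.register_namespace("dc", DC)
    record = ET.Element("record")
    ET.SubElement(record, f"{{{DC}}}title").text = title
    ET.SubElement(record, f"{{{DC}}}creator").text = creator
    ET.SubElement(record, f"{{{DC}}}date").text = datetime.date.today().isoformat()
    ET.SubElement(record, f"{{{DC}}}coverage").text = (
        f"bbox={b.left},{b.bottom},{b.right},{b.top}; crs={crs}"
    )
    return ET.tostring(record, encoding="unicode")

# e.g. print(minimal_record("survey_area.tif", "Survey area DEM", "A. Researcher"))
```

The point of the sketch is the division of labour: anything machine-readable (extent, projection, date) is filled in automatically, leaving the researcher to supply only a title and a name.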

 

 

Data.

·         No consultation on which core geospatial datasets academic sector users require access to has taken place.  The multi-disciplinary nature of activity in the academic sector could make the identification of core datasets difficult. The suggestion was made that the INSPIRE themes be used.

·         Using new technology, we are creating new types of datasets that need to be shared – more openly. Informal systems such as Google arguably make this easier than the more formal, structured GIS technology does, at least for non-technicians.

·         Citing of geo datasets is important for research and education purposes – and should (somehow) be possible and allowed for ‘informal’ datasets as well as the more structured ones. This rarely happens and again perhaps reflects weaknesses in the teaching of data management and the research process.

·         Serious issues remain in relation to combining datasets – something that is the norm in academic research – resulting in a new, derived, dataset. How does a researcher easily get approval to use the different datasets, publish the results, and ‘certify’ the resulting final dataset in some way (given the perhaps unknown provenance or quality of the original datasets)? Are special rules or principles – and tools! – needed for the academic community, where this is a bigger problem/issue than for other typical geospatial data users? If so, how are these to be developed, agreed, promulgated, etc.?

 

 

Information Management ‘Best Practice’.

·         The informal systems we find used today by many researchers and teaching staff lack or do not enforce best data-management practice, which is a negative aspect in the long run, especially for the research community – and (perhaps) for future employers if those being ‘educated’ are not learning the skills needed in commercial use of geodata.

·         We should formally recognise that “there will be different or multiple ‘best practices’ for data access, use and re-use”, due to the multitude of different types of datasets that could have a location attribute.

 

 

Data Access and Sharing.

·         While metadata creation and publishing metadata and datasets are important, ease of access to geodata should still be the first, overriding consideration. This is especially true for students, but also for basic research. This encompasses both ease of technical access and the policy issues, i.e. use/re-use rights, charging regimes, etc.

·         Access control needs further examination to identify and promote best practice. Both the NERC Data Grid and the EDINA services that provide access to legally protected resources, such as Digimap, use access control systems, as opposed to more general Digital Rights Management (DRM) protection techniques (which are not yet available, in most cases). A schematic illustration of the access-control approach is sketched after this list.

·         Much more work is needed in regard to implementing DRM in future services. DRM is still in its infancy in the geo world (OGC’s GeoDRM Reference Model and the ISO GeoREL standard for a geospatial rights expression language have not yet been implemented in software tools).

·         Licensing control approaches often prevent researchers from sharing data, and this could prevent data from being placed into a repository and/or curated (see later discussion on preservation and curation). This is true already today, in circumstances where a dataset with legally restricted access (for example OS MasterMap data) is used in creating a derivative dataset from which the original (licensed) data could not realistically be ‘removed’ prior to placing the derivative dataset into a repository – and if it were, the ‘geo’ attribute of the dataset could be lost in any case.

·         Many of the data sharing issues impact all research data and not only those working with geodata – or even just the research sector.

·         How to develop and implement interoperable ‘secure access’ services needs more study and pilots to be developed, to determine (a) feasibility, (b) scalability, (c) tool sets (embedded in applications?), and (d) best practice.

·         Until the research community can develop and demonstrate that secure services – protecting IPR and addressing other DRM issues – can be implemented interoperably, computational/data grids cannot become more widespread or ‘open’. Such grids will remain accessible only to those who (a) know where all the resources are located and (b) enter into prior access control agreements.
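The contrast drawn above between access control and per-dataset DRM can be pictured with the following hypothetical sketch: a thin gatekeeper function that checks an institutional token before forwarding a map request to a licensed service. Every name in it (the endpoint, the token set, the layer) is an invented placeholder; real services such as Digimap rely on federated access management rather than anything this simple.

```python
# Hypothetical sketch of access control in front of a licensed geodata service.
# All endpoints, tokens and parameter values below are invented placeholders.
from urllib.parse import urlencode
from urllib.request import urlopen

LICENSED_WMS = "https://example.ac.uk/protected/wms"   # stand-in for a licensed service
AUTHORISED_TOKENS = {"demo-institution-token"}          # stand-in for a federated login check

def fetch_map(token: str, layer: str, bbox: str) -> bytes:
    """Forward a WMS GetMap request only if the caller presents a recognised token."""
    if token not in AUTHORISED_TOKENS:
        raise PermissionError("Access denied: user is not covered by the data licence")
    params = urlencode({
        "SERVICE": "WMS", "VERSION": "1.3.0", "REQUEST": "GetMap",
        "LAYERS": layer, "STYLES": "", "CRS": "EPSG:27700", "BBOX": bbox,
        "WIDTH": 600, "HEIGHT": 400, "FORMAT": "image/png",
    })
    with urlopen(f"{LICENSED_WMS}?{params}") as response:
        return response.read()

# e.g. fetch_map("demo-institution-token", "background_map", "320000,670000,330000,680000")
```

The design point is that the licence check happens at the service boundary, once per request, rather than being embedded in every copy of the data as DRM would require.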

 

 

Tools and Technology.

·         No one disagreed with the observations arising from the 15 May workshop, as listed in Annex 2 to this report. However, it was generally agreed that much work was still needed – and was on-going – in relation to practical implementation for many of the tools and technologies presented in these observations.

·         In regard to data mining and discovery, perhaps what is needed is an “intelligent federation of catalogues”, which raises standards and implementation issues separate from those of creating an intelligent catalogue in the first place (a minimal federated catalogue search is sketched after this list).

·         More study is needed on the issues – technical and legal – relating to remote processing of data. (Some of the issues for remote access and processing are already covered in other themes within this report).  Such study/research also needs to look into the future, for example at current thinking on how new web services (Web 2.0) will develop and evolve.

·         Regarding web services, “someone needs to stand up data access services in the first place and/or expand existing ones”. This is about JISC being willing to fund the capacity of data centres to support production web services. The technology has been proven in a number of projects. Perhaps it is a question of gathering the evidence to make the case, but this may be a chicken and egg situation – people want the services in order to find out if they can benefit from them, but JISC won’t fund the services until there is evidence that there is a need.

·         In regard to services (web or otherwise), open source principles should be encouraged and promoted via the use of open source tools, although this does raise the issue of long-term support for such tools. Yet the open source community, which is global, has already demonstrated that it can provide support for major parts of the information infrastructure, e.g. Apache web servers (a 49.8% share in August 2008, still outnumbering Microsoft at 34.9%), and governments around the world are issuing edicts that open source software will be used in government departments where it exists and meets user requirements. One only has to look at the success of OpenOffice to see what can be accomplished (see also ‘Food for Thought’ in Annex 5 to this report).

·         Regarding sensor networks, it was observed that there is a huge number of sensor-based projects underway, spanning many different types of sensors, configurations, capabilities, etc. Because these exist within quite disparate disciplines, yet may have common requirements, who has the lead on providing a strategy for developing sensor network interoperability, or interoperability of the data gathered by disparate sensor networks? Further study/research and/or a pilot project looking into the issues surrounding sensor networks were recommended.

·         Regarding portals, it was proposed that the academic community should facilitate creation of “virtual worlds”, defined as personalised data spaces, data services, etc. This needs more study (as to what is feasible today and in the near future) and/or a pilot project to explore the possibilities and value of ‘virtual worlds’ for researchers.

·         How can we make it easier to create virtual communities (as opposed to personalised ‘virtual worlds’?) who share a common interest, which may come and go, i.e. do not need ‘permanent’ infrastructure? What can be learned from existing social networking practice, creation of virtual organisations, ‘community creation’ tools and best practice?

·         The exciting possibilities of social networking tools for the geospatial community were discussed.  The idea of a geoFacebook was proposed that allowed users to visualise geospatial datasets of others in the network using OGC clients within the social networking environment, upload their own datasets and apply their own tags (informal metadata).  This would be an interesting future project.
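The “intelligent federation of catalogues” idea above could, at its simplest, begin by fanning a single query out to several OGC Catalogue Services for the Web (CSW). The sketch below does that with OWSLib; the endpoint list and search term are placeholders, and a real federation would additionally need de-duplication, ranking and the semantic ‘intelligence’ discussed above.

```python
# Minimal sketch: fan one free-text query out across several (hypothetical) CSW
# catalogue endpoints and pool the results. Requires the OWSLib library.
from owslib.csw import CatalogueServiceWeb
from owslib.fes import PropertyIsLike

CATALOGUES = [
    "https://catalogue-one.example.ac.uk/csw",   # placeholder endpoints only
    "https://catalogue-two.example.ac.uk/csw",
]

def federated_search(term: str, max_per_catalogue: int = 10) -> list[tuple[str, str]]:
    """Return (catalogue URL, record title) pairs matching `term` across all catalogues."""
    hits = []
    for url in CATALOGUES:
        csw = CatalogueServiceWeb(url, timeout=30)
        query = PropertyIsLike("csw:AnyText", f"%{term}%")
        csw.getrecords2(constraints=[query], maxrecords=max_per_catalogue)
        hits.extend((url, record.title) for record in csw.records.values())
    return hits

# e.g. for source, title in federated_search("flood"): print(source, "->", title)
```

Even this naive loop makes the standards point: federation is only cheap if every participating catalogue speaks the same published interface.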

 

 

Infrastructure.

·         There is already much information infrastructure in and across the academic community. This needs to be identified and related to SDI needs or components.

·         There are other infrastructures in existence in the community that need to be included in SDI – somehow.

·         Information infrastructures – and SDIs – change over time – we are seeing an evolutionary process – but the main issues remain those of standards and governance. Both are needed if important datasets are to be preserved, made discoverable, accessible and usable.

·         Do we need some sort of academic sector ‘rules’ for participating in a national SDI?

·         Generally agreed that separate rules regarding access and usage rights may be needed, especially for public sector information.

·         We should not worry about an ASDI “fitting into some national SDI” since the NSDI will inherently become the collection or federation of other, existing SDIs – thematic or sectoral – and thus incorporate the ASDI, providing we use the same standards.

·         It is good to have high-level agreement on how beneficial it is to share datasets, as principles of ‘best practice’, but the difficulty starts “at the coal face”. A number of issues arise, for different types of datasets, both technical and policy oriented, including privacy protection (e.g. for medical datasets), IPR (copyright and database protection), charging regimes and ‘best practice’, etc.

·         The main issue with ‘federated SDIs’ is to find (and implement) better ways to enable existing or developing (sectoral, thematic) SDIs to work together (‘interoperate’). Equally important is to find ways for SDIs to be able to interoperate with non-geospatial information infrastructures.

·         We may never see ‘federated SDIs’ - because of the many differences in the wide variety of datasets produced by, or of value to, the academic community, for education and research, across quite different disciplines - but we still need to be able to locate and access potentially interoperable data sets. From the spatial ‘interoperability’ viewpoint, this implies some form of standardisation on recording location attribute data/attributes within all types of datasets. [“A medical dataset is never going to fully follow the geospatial standards set by ISO or OGC, because most of the data in that dataset is non-spatial – only the location attribute – typically an address, place name or post code – is ‘spatial’ and should (somehow) follow an agreed standard. Then the medical/health data could be located and spatially analysed if someone wanted to.”]

·         An ‘Onion Model’ of SDIs was proposed to accommodate the wide variety of requirements within SDIs generally, but particularly the requirements of the academic sector. This model would have progressively more formal layers of standards requirements, metadata and governance progressing into the onion. The outer layer would include a very minimal, informal set of metadata requirements, mostly consisting of informal user tagging, and very limited requirements for standards and protocols (for example, perhaps using HTTP). The second layer would include still very simple metadata, but incorporate more structured yet still minimal standards like those used by Google. A third layer might use simple metadata standards like Dublin Core and perhaps a simple requirement to use OGC standards (or equivalent non-spatial standards). Finally, the fourth, innermost layer would include the current ‘best practice’ in regard to formal standards for metadata and data specification at a ‘generic SDI’ level, including specific metadata standards and profiles for OGC web services that specify particular data models (for example, INSPIRE). A schematic rendering of this model is sketched after this list.

·         A study should be carried out looking specifically at the relationship between the perceived special needs of an academic SDI and the INSPIRE Directive’s implementation within the UK. For example, what datasets must be INSPIRE compliant (according to the INSPIRE Directive) and what will be mandated within the UK’s response to INSPIRE, during implementation? [This supports point 3 (c) in Annex 2, a recommendation from the 15 May workshop: “Potential impact of INSPIRE technical Implementing Rules should be monitored on an on-going basis over the next two to three years.”]

·         Data ‘interoperability’ is being confused with ‘system’  or ‘service’ interoperability. Data can be shared between interoperable services if the attached metadata (which does need to be standardised) can be used to create a ‘cross walk’, ‘look up’ or similar transformation service.

·         Where data does use standards (either for metadata or data specification), then these should be well accepted (preferably international) standards.

·         How best to access the whole range of geospatial datasets and tools (remote services, models, knowledge bases) under a single infrastructure?
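To make the ‘Onion Model’ proposal above easier to discuss, here is a schematic rendering of its four layers as a small Python structure. The wording is paraphrased from the description in the bullet above and nothing further is specified, so this is a discussion aid rather than a design.

```python
# Schematic rendering of the proposed 'Onion Model', with layer 1 as the
# informal outer skin and layer 4 as the formal core. Wording paraphrased from
# the workshop discussion; the field names are illustrative only.
ONION_MODEL = [
    {"layer": 1, "name": "outer / informal",
     "metadata": "informal user tagging and annotation",
     "standards": "very limited, e.g. plain HTTP access"},
    {"layer": 2, "name": "lightly structured",
     "metadata": "still very simple descriptive fields",
     "standards": "more structured but minimal, like those used by Google"},
    {"layer": 3, "name": "structured",
     "metadata": "simple standards such as Dublin Core",
     "standards": "a simple requirement to use OGC standards (or non-spatial equivalents)"},
    {"layer": 4, "name": "inner / formal SDI",
     "metadata": "current best-practice geospatial metadata standards and profiles",
     "standards": "OGC web service profiles with specified data models, e.g. INSPIRE"},
]

def requirements_for(layer_number: int) -> dict:
    """Look up what a resource would need to provide to sit at a given layer."""
    return next(layer for layer in ONION_MODEL if layer["layer"] == layer_number)

# e.g. requirements_for(3)["metadata"]  ->  "simple standards such as Dublin Core"
```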

 

 

Preservation, Digital Curation, Archiving.

 

 

·         With regard to the earlier mention of citing geo-datasets, it was reported that there is now an on-line repository/journal where datasets can be published and cited. There is also an extensive discussion of this topic at the Digital Curation Blog from August of this year: http://digitalcuration.blogspot.com/2008_08_01_archive.html. Various entries in this extensive blog report on developments in Current Research Information Systems (CRIS) in the USA and Europe, including a European organisation – EuroCRIS – that “has generated several versions of a data model and interchange standard (which it describes as the EC-recommended standard CERIF: Common European Research Information Format), of which the current public version is known as CERIF 2006 V1.1” (see http://www.eurocris.org/cerif/cerif-releases/cerif-2006/). [Perhaps this is something that needs further investigation specifically with regard to geospatial data – if this has not already been done.]

·         There is an upcoming workshop “Archiving 2008 - Preserving and Enabling Permanent Access to Cartographic Cultural Heritage - Workshop on Archiving in Digital Cartography and Geoinformation” - Berlin, Germany, 4-5 December, 2008. (see http://www.codata-germany.org/Archiving_2008/index.shtml).

·         Preservation and curation (archiving) are important for a number of reasons, including proving the provenance of original datasets later used in projects creating derived datasets and to be able to replicate prior experiments/research work.

·         As well as adopting and promulgating agreed formats for official data depositories, ways needed to be found to incentivize researchers to proactively put their research datasets into recognized repositories, where these exist. This raised the question of whether other or new repositories needed to be created, on an official basis, and who would look after these on a long term, sustainable basis.

·         Recommendation emerging was that a formal policy for archiving geodata is required, “defining what and how (to) archive GI and make it available in the future” (mirroring the recommendation from the 15 May workshop). However, more study was needed into what this policy should be, focusing on the ‘location attribute’ that may be only one of scores of attributes in a data record, but which is necessary to be able to search for resources spatially.

·         A prior study (GRADE) found that academic users wanted a central repository for spatial data, but JISC believed that it was up to institutions to implement such repositories. This raises issues of long-term sustainability. There is no point in creating a repository that does not have assured longevity – otherwise the main purpose of preservation is lost.

 

 

Other Issues and Observations.

 

 

·         The COMPASS project started in December 2007 and runs through to end March 2009. Its goal is to “build an infrastructure that supports scientists in the discovery, access and use of scientific resources (including journal articles, data sets, web services and scientific models).” One of the proposed outputs is “a technical design for the infrastructure” (see http://compass.edina.ac.uk/tiki-index.php). Outputs from this project could provide important technical components for the ASDI. The pilot sector is the coastal marine environment.

·         It is important to be aware of the potential of geo-referencing other types of resources, e.g. photographs, documents, video etc. In addition to indexing by ‘what a resource is about’ (subject content), we ought to be able to index cultural, statistical and most other information by ‘where it is about’ (and ‘when it is about’).

 

 

Making it Happen – Part 2

·         “Making it Happen – Part 2” topics received less coverage in the 29 August workshop due to time restrictions; however, it was agreed that most of the observations presented from 15 May had either already been covered in prior discussion (outlined above) and/or were non-controversial, e.g. regarding standards, interoperability and data charging (access).

·         The main discussion covered who should be responsible for representing the academic ‘geo’ community in regard to national (UK Location Strategy) and European (INSPIRE and its UK implementation rules) SDI issues. The feeling was that the JISC GWG should act as spokesperson for geo activities.

·         In regard to ‘finding a champion’, it was also suggested that ‘climate change’ could be a driver for added funding of projects relevant to the geo community. No champion seemed to have emerged to date, equivalent to the three EU Commissioners who backed INSPIRE in 2002.

 

 

General Recommendations.

·         Whether talking about data, sharing, tools, services, etc., the community needs a long-term strategy for all of this, rather than just the output from another study or ‘best practice’ pilot – otherwise the community will not buy in to the need to adopt whatever best practice, tools or standards are put forward, since these will have no claim to permanency. This raises the questions of: (a) whose responsibility is it to develop this strategy, (b) who would be required to, or would choose to, adopt the strategy and under what set of legislation, rules or incentives, and (c) would the adopters receive additional support (financial and human resources) to implement the tools needed to deliver the strategy? As we see with the INSPIRE Directive and the impact it is having, and will continue to have, on national SDI implementation, developing standards and best practice is a lengthy and expensive business, and there has been little or no ‘new money’ made available either by the EU or by member states in implementing their own NSDIs.

§         Any long term strategy should incorporate a plan to combine existing formal approaches to SDI with the informal, GYM approaches that are popularising geospatial information globally.  Any strategy that does not consider these aspects is likely to be of limited success and applicability, and SDIs around the world will be looking for ways to engage with and take advantage of the resources and market intelligence of GYM.  Thus there is an opportunity for the UK academic sector to lead the way in developing such solutions to harness the power of both approaches.

 

 

 

 


Prior Recommendations of Interest.

 

 

The list of recommendations in the “JISC Geospatial Working Group Vision: Executive Summary” from January 2007 should be revisited, as many of the key points raised in the discussions of both 15 May and 29 August were already highlighted among the 24 recommendations in the “Consolidated list of Recommendations” in that report more than 18 months ago. The question arises as to why more action has not taken place in regard to these recommendations – or at least what is the state of play in regard to the recommendations? Those that are most allied to points raised in the workshops include (verbatim):

(1) We need to view our services and collections as part of a strategic national spatial data infrastructure. This has implications for the functioning and delivery of all JISC services and will have a marked impact on their potential.

(4) There is need for a gap analysis between what the JISC envisages for the E-framework and what the GI community envisage for a spatial data infrastructure. This gap analysis will aid mutual comprehension.

(10) Geospatial data standards need to be thoroughly integrated into the standards development and advisory services that we support and recommend.

(11) We should aim to ensure that all of our geospatial services are inter-operable and we should strive to try to link our geospatial services with other JISC content providers to enrich these resources.

(12) Digital rights management tools have to keep pace with changes in the legal framework that underpins our collections and the expectations of what should be possible in a service oriented approach to information provision.

(13) Rights pertaining to derived data need to be more thoroughly understood to encourage the sharing and re-deployment of existing resources.

(14) Extensions to the copyright licence should be implemented in full to meet the widest needs of the community.

(16) Mechanisms to allow researchers to share data and maps as by-products of research should be supported. In particular there should be discussion between the Research Councils and the JISC on access to specialist data and maps produced by the research community.

(18) We need to develop a practical action plan that will embed and disclose JISC’s Geospatial data collections to the Grid computing community.

(22) Clearer guidance is needed to ensure that data centres and institutions provide complementary and consistent levels of support to users of geospatial data sets.

The full list of recommendations is contained in Annex 6.

 

 

 

 

 

 

 

 

 

 

 

 

 

 


 

 

Annex 1. “Making It Happen – Introduction”

 

 

Academic SDI Technical Architecture Workshop

29 August 2008

 

 

Introduction

 

 

-          Background to both workshops

o        Why is the consultation being undertaken?

-          Format of the day

o        not a brainstorming session - an open, active dialogue - everyone participates

o        no break out groups

o        a.m. – work through prepared material & set of questions relating to prior observations

o        p.m. – open (directed?) discussion on issues arising from the a.m. discussion

o        draft summary prepared at end of the day

 

 

Workshop Objectives

-          Establish the main elements of the technical architecture of the UK academic sector SDI.

-          Identify key steps necessary to progress the UK academic SDI.

 

 

Expected Outputs

-          A series of specific recommendations to JISC on progressing the UK academic SDI.

-          This "roadmap" will be presented in the context of the EDINA e-Framework study.

 

 

Background

-          First consultation meeting on 15 May 2008

-          29 participants - brainstorming format – observations but no conclusions.

-          Looked at the “who, what, why, when” of geodata use in higher education now and into the future.

-          Group observations relating to “Data and Content” and “Tools and Technology” are relevant for our meeting today.

 

 

Today’s meeting

-          Takes an ‘architecture’ focus – input from technical specialists and pedagogical practitioners.

-          Looks at the “how” – what is needed in the ‘academic sector SDI’.

-          Policy issues, such as access, use and re-use of geodata - due to the different ways that geodata is used in academic institutions?

o        pedagogy,

o        ‘academic’ research (by students/staff),

o        ‘paid’ research – for outside agencies on profit making basis

 

 

Why are we here?

-           “Making it happen” follow-on from 15 May workshop, i.e. the ‘how’.

-          Deliverable to JISC, including recommendations for future technical aspects of geodata service delivery, building on the 15 May workshop results and today’s dialogue.

 

 

Planned follow-up actions

-          Circulate a draft report of the issues, discussion and any draft recommendations to a wider group for greater legitimacy;

-          Open the wiki to the wider education community and advertise for input/comments;

-          Write a more accessible final report for JISC and the academic community.

 

 

Practicalities - what can we do in one 5-6 hour day?

-          Propose a framework for categorising issues, e.g. [Technical] – [Data] – [Policy], based on results of the 15 May workshop.

-          Make recommendations only in those areas that are considered to be ‘unique’ to an ‘academic sector’ SDI?

-          Build on what went before (i.e. 15 May meeting)

 

 

Definition time – what are we talking about?

-          What do we mean by ‘SDI’?

-          [INSPIRE Directive definition: “...metadata, spatial data sets and spatial data services; network services and technologies; agreements on sharing, access and use; and coordination and monitoring mechanisms, processes and procedures ...”]

 

 

What do we mean by “academic SDI”?

-          What is ‘unique’ about the academic sector compared to other geodata use sectors?

-          Geodata use is predominantly pedagogical and research oriented (with little commercial activity – but not zero – adds a complication on access, use, re-use)

Main users are non-commercial, non-governmental, on a learning curve

-          Preservation of geodata is (perhaps) more important than in commercial sector(s)

-          More?

Access, use, re-use – both technical and policy elements

-          If the main thing that is unique about ASDI is the access, use and re-use policy issue, then does this alone require discussion of a separate ‘SDI’ – or rather an “academic data access and sharing policy” that is specific to an academic environment?

-          We live in a dual-business-model geo industry world (even ‘multiple business model’) -  is academia any different in that sense?

-          Different enough to be considered ‘unique’?

 

 

Relationship with other SDI initiatives.

-          Current UK regional SDIs (Wales, N.I., Scotland “strategy”)

-          Thematic SDIs (marine SDI – IACMST’s MDIP – Marine Data and Information Partnership)

-          UK Location Strategy

-          INSPIRE (UK implementation connotations, specific to academia v. government authorities)

-          Is there an ‘academic SDI’ focus anywhere else, i.e. USA (NCGIA, URISA), Canada, Australia/NZ, elsewhere?

-          GSDI, GEOSS, UNSDI

o        both GSDI and UNSDI are currently debating what “SDI” means in practice

 

 

‘Academic’ SDI versus Generic (National) SDI

 

 

-          Can an “academic SDI” stand separately from National (generic) SDI?

o        If not, then how does one distinguish the ‘academic’ SDI elements from generic elements?

o        Same question applies to other “thematic” or “sectoral” SDIs, e.g. marine/coastal, health, meteorology?

 

 

Technical Issues

-          What are the “technical issues” and who is involved in resolving these?

-          Does academia play an important role in regard to technical issues?

-          Or are these driven mainly by ‘big business’ and/or government needs?

 

 

Standards as one key technical issue

-          Standards development organisations (SDOs), e.g. OGC, ISO, CEN, set the standards.

-          How do you ‘standardize’ when standards are a moving target, i.e. early OGC specs and some ISO 19xxx standards are already under revision or extension; new ISO work items are added regularly (ISO’s GeoREL draft standard was one of the latest additions to TC 211’s work, based on OGC’s earlier GeoDRM Reference Model work).

-          Vendors (both geospatial tools and data) introduce interoperability standards into their product lines as a matter of survival - this will continue into the future (in the case of software, partly due to the FOSS movement).

-          Content providers use standards if it is in their best interest – which it usually is, today – yes?

-          Initiatives like INSPIRE will set minimum interoperability standards – and very many datasets will not be required to comply with INSPIRE specifications.

 

 

Geodata Preservation

-          Is there a special role in regard to geodata preservation, especially digital GI, in link with digital library initiatives and technologies?

o        Datasets resulting from research disappearing once the research project is completed – and funding for the server disappears (!)

o        Should this be a concern for NSDI as well, especially for government departments? Do they have the technology and skill to do this?

 

 


Annex 2. “Making it happen – Part 1”

 

 

SDI Tech Architecture workshop

29 August 2008

 

 

Workshop Objective 1:

 

 

·         Establish the main elements of the technical architecture of the UK academic sector SDI.

 

 

·         Build on results of 15 May workshop

 

 

·         There is significant overlap between Data and Content Enablers and the Tools and Technology Enablers

  

Data issues    Policy issues   Technical issues

 

 

Data and Content

 

 

1.        Metadata and semantics will be important.

a.     Metadata must provide information on:

                     i.            data quality

                   ii.            validation

                  iii.            publication

                iv.            provenance

b.     There must be mechanisms in place for automatic data documentation, i.e. metadata creation

c.     Metadata and data content should be semantically enriched.

 

2.      Types of data

a.     Difficult to specify core datasets because they vary between disciplines. Multi-disciplinary focus groups must determine:

i)        What data are required by all (the “Framework” datasets)

ii)       A list of datasets core to each discipline/domain

b.     Global Elevation data is increasingly important and collection and distribution should be funded.

c.     Multi-scale GI is important and should continue to be provided (e.g. not just OS MasterMap but smaller scale, generalised products too).

d.     There is a requirement for data across time (4D data), but because temporal data is characterised by change, it is not always possible to come up with a definitive list.

e.     Role for community generated data - collection and distribution should be encouraged.

f.        Definitive addresses are important and should include UPRNs (Unique Property Reference Number) which comply with appropriate standards (BS7666, ISO…..)

g.     Developing data about and for virtual environments should be encouraged and incorporated in GI related research and teaching.

h.      Databases must be considered as well as datasets, in terms of access, sharing, deposit and archiving.

i.         Derived data is increasingly important but onward use is restricted by IPR issues, which must be addressed.

 

3.      INSPIRE

a.     All data and metadata issues, especially access, relate to INSPIRE and need realistic hierarchical prioritisation.

b.     Investigations into the relevance and application of INSPIRE to the academic community must be undertaken.

c.     Potential impact of INSPIRE technical Implementing Rules should be monitored on an on-going basis over the next two to three years.

  

4.      Interoperability

a.     Data must be interoperable and comply with appropriate standards. 

 

5.      Standards

a.     Data that is available needs to comply with all appropriate standards.

 

6.      Data storage and distribution

a.     Methods to allow easy and efficient depositing of data (in repositories, archives) are essential and must include:

         i.            Simple methods of format conversion, and acceptance of data in various formats (with appropriate metadata),

       ii.            Tools for creation of metadata on data deposit,

      iii.            Spatial searching.

b.     Policy and standards are required which define:

         i.            formats suitable for depositing,

        ii.            how data deposit will be enforced / encouraged,

      iii.            how and where databases should be deposited / archived.

c.     A policy for GI Archiving is required, which defines what and how we archive GI and make it available in the future 

 

7.      Digital Rights Management (DRM)

a.     Methods of DRM must be established, tried and tested to ensure they facilitate the use of GI, rather than stifle or complicate it.

b.     Wherever possible, the need for DRM should be avoided, to encourage openness and sharing of data.

c.     GRADE project results – ever taken forward/further?

 

9.      Real data sharing

a.     Data sharing should be encouraged but there are many barriers (IPR, DRM, lack of suitable on-line facilities, lack of metadata, non-interoperability), which are addressed in other points in this section.

 

 

b.     How to overcome these barriers must be investigated. [How, by whom, who pays, a long-term issue]

 

 

c.     Suitable mechanisms [and policies] for sharing should be developed and encouraged (repositories, networking communities etc) 

 

10.  Relevance of non-geospatial data sources

a.     Academic GI Policy and strategic development must also think about the non-spatial data as connected to the spatial.

b.     Many datasets, collections and services are not identified as “geographical” but include geographical information (e.g. place names) in their associated metadata and the geo-referencing of these resources should be championed.

 

 

 

 

Tools and Technology

  

1.    Data Mining and Discovery

Better tools to enable the discovery of data need to be in place, such as a Digital Data Broker, which:

a.     includes an intelligent catalogue of data that uses semantics, ontology and  data dictionaries to:

                     i.            provide a measure of appropriateness of the data for the users’ task,

                   ii.            provide mechanisms for self selecting data,

                  iii.            allow some reasoning behind the discovery and use of data.

b.     minimises the possibility of misuse of spatial data by non-geospatial experts by providing simple but quality metadata ensuring provenance. [fitness for purpose?]

c.     minimises the possibility of  incorrect references in reports by providing clear referencing/citation information. 

 

2.    Tools for remote processing of data

The choice of desktop tools available is diminishing [yes?]. On-line tools will be available which allow academics to remotely process data:

a.     A user can send tools to the data (or vice versa) along with instructions about what processing is required.

b.     Could be based on /similar to Web2.0 tools (e.g. Google SketchUp, Google Docs)

However:

c.     Issues of bandwidth must  be addressed.

d.     IPR, copyright, legal and ethical issues must be addressed in relevant policy and legal frameworks.

e.     Data integrity must be preserved.

  

3.    (Web) Services

a.     A standard set of geo-processing services will always be available, including:

                     i.            geoprocessing

                   ii.            visualisation

                  iii.            spatial analysis

                iv.            statistical analysis

                 v.            “Shimmer Services”, e.g. “format shim”, “co-ordinate shim”. These are online services, so there is no need to roll out software across campus, which takes a long time (see http://intranet.cs.man.ac.uk/img/shimmer/index.php). A minimal ‘co-ordinate shim’ sketch appears at the end of this annex.

b.     Services will be available via Grid services if and when  appropriate.

c.     Services will be open source.

d.     Services will provide a solid basis for improvement, e.g. move from 2D to 3D (to 4D?).

e.     The methods of payment and contracting work with web services will be standardised and clearly defined (e.g. a pay per kb model?). [multiple models will emerge]

  

4.    Hand held mobile devices [and convergence of mobile device functionality]

a.     Such devices will be widely available and will include the following features:

                                 i.            portable, light

                               ii.            PCMCIA

                              iii.            wireless

                            iv.            MEMS

                             v.            GPS-enabled

                            vi.            compass

                          vii.            multi-media recording and playback (audio/video)

                         viii.            wearable (e.g. like a watch)

                             ix.            rugged (for field work)

 

 

b.     There will be a common open source mobile platform. [maybe? but even if there is more than one, as long as they are open, does it matter? you can’t stifle competition – or innovation]

 

 

c.     They will be easy to use. [need voice control and input plus the software and processing power to handle that] [Microsoft Corp. released its first voice-controlled software for Windows Mobile(tm)-based Pocket PC and Pocket PC Phone Edition in November 2003]

 

5.    Sensor Networks

 

 

a.     Methods and tools [and policies?] for accessing sensor data must be established.

 

 

b.     There must be a set of computer standards for the collection, retrieval and application of sensor data and for the development of associated tools. [ISO 19130 + LOTS of ISO standards relating to different types of sensors] (A minimal capabilities check against a sensor web service is also sketched at the end of this annex.)

 

 

c.     There should be a fully distributed sensor data network. [But there can be many different types of ‘sensor’ for many different purposes used in many different disciplines – so does it make sense to talk about “a fully distributed sensor data network”?]

 

6.      Tools / Portals

 

 

Any tools and portals developed should embrace the following to maximise the sense of telepresence a user experiences:

a.     Virtual world technologies

b.     Social networking technologies 

 

7.    What have we/they missed?
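As a concrete example of the “Shimmer Services” mentioned under (Web) Services above, the sketch below implements a toy ‘co-ordinate shim’: a single function that re-expresses British National Grid coordinates as WGS84 longitude/latitude using the pyproj library. The library choice and the example coordinates are assumptions; the Manchester Shimmer services referenced above are separate, hosted services.

```python
# Toy 'co-ordinate shim': re-express British National Grid (EPSG:27700) points
# as WGS84 longitude/latitude (EPSG:4326). pyproj is an arbitrary library choice.
from pyproj import Transformer

TO_WGS84 = Transformer.from_crs("EPSG:27700", "EPSG:4326", always_xy=True)

def coordinate_shim(easting: float, northing: float) -> tuple[float, float]:
    """Return (longitude, latitude) for a BNG (easting, northing) pair."""
    return TO_WGS84.transform(easting, northing)

# e.g. coordinate_shim(325000, 674000) -> roughly (-3.2, 55.95), central Edinburgh
```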
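On the Sensor Networks point above, one existing family of standards that could underpin interoperable access to sensor data is the OGC Sensor Web Enablement suite, in particular the Sensor Observation Service (SOS); the suite is not named in the notes above, so treat its mention here as an assumption rather than a workshop conclusion. The sketch below simply asks a placeholder SOS endpoint for its capabilities document and reports the advertised service title.

```python
# Minimal sketch: ask a (hypothetical) OGC Sensor Observation Service for its
# capabilities document and report the advertised service title.
import xml.etree.ElementTree as ET
from urllib.parse import urlencode
from urllib.request import urlopen

SOS_URL = "https://sensors.example.ac.uk/sos"   # placeholder endpoint

def sos_service_title(url: str) -> str:
    """Fetch GetCapabilities from an SOS endpoint and return the ows:Title text."""
    params = urlencode({"service": "SOS", "request": "GetCapabilities"})
    with urlopen(f"{url}?{params}") as response:
        document = ET.fromstring(response.read())
    title = document.find(".//{http://www.opengis.net/ows/1.1}Title")
    return title.text if title is not None else "(no title advertised)"

# e.g. print(sos_service_title(SOS_URL))
```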

 

 

 

 


Annex 3. “Making it happen – Part 2”

 

 

SDI Tech Architecture workshop

29 August 2008

 

 

Workshop Objective 2:

 

 

- Identify key steps necessary to progress the UK academic SDI.

 

 

Summary of 15 May Workshop Session output

 

Standards.

 

 

·         Standards for data interchange - much wider field than at present.  Lots of people would dearly love to have systems and tools which would allow them to work in a particular way

 

 

Interoperability.

 

 

·         Data must be interoperable to make it as widely usable as possible, AND tools are needed to make it easy to use that data.

 

 

Data charging.

 

 

·         Accessibility is also a requirement. We must therefore get away from charging for data. Charging means that you switch off at least 50% of potential users. If you charge too early in the process, you lose people.

·         Non-traditional disciplines are massively underfunded and therefore never consider purchasing anything or going in that direction. There is a plethora of material in the US because they don’t charge.

 

 

How and Where to make the case.

 

 

·         It might be argued that GI users don’t make their case strongly enough in the right places, but is this the case in the UK? Particle physics gets tons of money, but they are data generators, not data users. We tend to stick to data collected by European agencies.

 

 

Paper to politicians.

 

 

·         Things are loosening up cf. 1998 prediction.  Shouldn’t support that policy. Good time politically to seek to influence data provision.  Minister needs to read the report on his desk!

·         Influencers are UK and European governments.

 

 

Non-OS options.

 

 

·         Options for developing resources outwith government agencies which should be discussed.  Many agencies would love to release themselves from OS if they were able to do so.

 

 

·         DMU – digitally mapping the whole of Leicester, range of different data types. Looking for interactive end of this to walk around city with tablets etc to interact with these data.  Would be v interested in Claire’s stuff.  Data available for free. Members of the public take photos which can be uploaded to the system for free.  Base data for city has nothing to do with OS. Lancaster University creating own dataset by sending students out with GPS units etc.  Institutions have created their own datasets for teaching, exempt from OS copyright.  BUT is this scalable – usually just an interesting exercise into what’s possible.  Open Street Map – good but all linear, problems with polygons.  Interesting licensing issues.  Wikmapping also quite good.

 

“Tools not Toys” - raise awareness of mobile technologies in other disciplines.

 

 

·         Leading edge technology and scalability: Claire Jarvis. Has funding to do stuff, but how do you roll it out nationally and who funds it?

·         Tablets and PDAs, TomToms, etc., available for loan.  Test purposes. Limited number for teaching time, so may be clashes.

·         In the future, students may be using stuff on their own phones, e.g. GPS, digital compass.  Claire sees a need to talk to other disciplines about not seeing these things as toys. They are tools – getting people to take them seriously.  Often the case with newer technologies.

 

Find champion and leadership/management of process.

 

 

·         We mustn’t jump on a different bandwagon – we should agree on what we do. The problem is finding a group which champions/leads/funds/manages this process. 

 

INSPIRE - Need a central Academic Community representative.

 

 

·         There is current concern about INSPIRE – there is nobody central to talk to about representing academia in INSPIRE. Consultation will be piecemeal and will come through FOIs and random people. Are these the right people to discuss INSPIRE with central government? The alternative is that the GWG takes on a much wider remit than it has currently.

 

 

·         The requirements of academics are so varied that consulting with them would take too long, be too rich and be unworkable. BUT should we have the discussion to agree that academia shouldn’t be consulted on INSPIRE? It could be argued that this is an unresourced discussion and is therefore empty and pointless. There is some resource – Digimap, Census, MIMAS’ new data. 

 

 

JISC must take responsibility for dealing with INSPIRE.

 

 

·         We should jump on the opportunities that JISC’s uncoordinated funding structures offer us. The main driver within JISC is not collections policy or spatial policy, but rather infrastructure and networking policy, leading to library resources. This is closely linked to networking standards.

·         JISC’s responsibility is to deal with INSPIRE – generating conduit between content providers and users.

·         What about the RCs? Standards for research institutions don’t appear on their radar. JISC is in the right position, but does it have that task or not?

·         Standards differ for mapping data, meteorological data, etc. – all different because the working groups don’t match. 

·         Is JISC capable of coordinating across research councils on policy and strategy? Probably not – there are no discussions at present. The activity of JISC has been uncoordinated w.r.t. geospatial data, with activity spread across different parts of JISC. Equally, JISC is parallel to the RCs ... and therefore is overarching too.  

·         European countries are jealous of JISC, because there is no equivalent elsewhere.  (Useful if JISC started to undertake this role?) 

 

Need to have dialogue on cross-European access and sectoral silos of data

 

 

·         Talking about sectoral silos and “build once, use many times”: the Dutch are starting their own Digimap, but there is no talk about cross-European access, even within the academic sector.

 

Need to convince JISC and Research Councils that we need to look at data infrastructure provision at an international level.

 

 

·         Some things are tightly knit internationally (e.g. met office data), but we need to convince JISC, the RCs and other bodies that we need to look at data infrastructure provision at an international level. The RCs are represented on the GWG, so the link is there.

·         NERC has a data infrastructure.

 

First to Market

 

 

·         Is there anything in being first to market?

·         If you have something that will do the job, just take it out there and see what happens. What’s wrong with this? 

 

Publish a paper (both academic and non-academic)

 

 

·         What will happen with today’s information? Will it be collated, distributed and used to move forward within JISC and the RCs? Publish a paper? Nowhere is a review like this made public. EDINA is happy for the information to be used to write a review paper; offers of assistance from the audience are welcome.

·         Useful to have an academic paper maybe? Makes it public … more food for thought.

·         We need to be able to lobby outside of academia, and a published paper might assist this, giving us something of which the appropriate managers are aware. We need to raise awareness within particular areas in and out of government, so that when requests for involvement are made, it rings bells. It would be a major part of the icebreaker; there is a lack of awareness among the bean counters.

·         We should be brave when this report is compiled and should leave a lot of stuff out, to avoid leaving a monolithic reservoir of detritus.

 

 

 

 


 

 

 

 

Annex 4. Participants List

 

 

Participant | From/role | Contact e-mail
David Medyckyj-Scott | EDINA (Geo Data and Research Services, Team Manager) | D.Medyckyj-Scott@ed.ac.uk
Chris Higgins | EDINA (Workgroup Leader, Product Services and Development); SEE-GEO Project Manager and Workshop Organiser | chris.higgins@ed.ac.uk
Roger Longhorn | Info-Dynamics Research Assoc. (Facilitator); Editor, GEO:connexion International magazine | ral@alum.mit.edu; roger@geoconnexion.com
Ben Butchart | EDINA (Senior Software Engineer, Workgroup Leader) | b.butchart@ed.ac.uk
Stuart Dunn | King’s College London | stuart.dunn@kcl.ac.uk
Phil James | Newcastle University | philip.james@ncl.ac.uk
Jeremy Morley | University College London | jmorley@ge.ucl.ac.uk
Richard Sinnott | National eScience Centre, Univ. of Glasgow | r.sinnott@nesc.gla.ac.uk
Kristin Stock | University of Nottingham | Kristin.Stock@nottingham.ac.uk
Andy Turner | University of Leeds | A.G.D.Turner@leeds.ac.uk
Andrew Woolf | NERC, Rutherford Appleton Labs | A.Woolf@rl.ac.uk

 

 


Annex 5. Food for Thought.

 

 

Tim O’Reilly on “cloud computing” and “open source”:

 

 

So here's my first piece of advice: if you care about open source for the cloud, build on services that are designed to be federated rather than centralized. Architecture trumps licensing any time.  But peer-to-peer architectures aren't as important as open standards and protocols. If services are required to interoperate, competition is preserved. Despite all Microsoft and Netscape's efforts to "own" the web during the browser wars, they failed because Apache held the line on open standards. This is why the Open Web Foundation, announced last week at OScon, is putting an important stake in the ground. It's not just open source software for the web that we need, but open standards that will ensure that dominant players still have to play nice. The "internet operating system" that I'm hoping to see evolve over the next few years will require developers to move away from thinking of their applications as endpoints, and more as re-usable components. For example, why does every application have to try to recreate its own social network? Shouldn't social networking be a system service?

...

Note that I said "reasonably open." Google Maps isn't open source by any means, but it was open enough (considerably more so than any preceding web mapping service) and so it became a key component of a whole generation of new applications that no longer needed to do their own mapping. A quick look at programmableweb.com shows Google Maps with about 90% share of mapping mashups. Google Maps is proprietary, but it is reusable. A key test of whether an API is open is whether it is used to enable services that are not hosted by the API provider, and are distributed across the web. Facebook's APIs enable applications on Facebook; Google Maps is a true programmable web subsystem.

 

 

from: http://blogs.oreilly.com/cgi-bin/mt/mt-search.cgi?blog_id=57&tag=cloud%20computing&limit=20


Annex 6. Recommendations from JISC Geospatial Working Group Vision, January 2007

 

 

1. We need to view our services and collections as part of a strategic national spatial data infrastructure. This has implications for the functioning and delivery of all JISC services and will have a marked impact on their potential.

 

 

2. We need to migrate the current mix of services

 

 

3. The development of the UK spatial data infrastructure needs to be cognisant of the E-framework, ensuring that the two are complementary not contradictory.

 

 

4. There is a need for a gap analysis between what the JISC envisages for the E-framework and what the GI community envisages for a spatial data infrastructure. This gap analysis will aid mutual comprehension.

 

 

5. The key to achieving interoperability between the JISC e-framework and a wider spatial data infrastructure will be mutual comprehension and collaboration.

 

 

6. We should sustain and where possible increase investment in the current mix of services that are available to the community.

 

 

7. We should look to develop new facilities within existing services to meet changing user needs and expectations.

 

 

8. We should invest to join up the current range of services in order to get an even greater return on the investment.

 

 

9. We should review previous research and development projects in the sphere of geospatial technology and make a clear commitment to those we wish to sustain as services.

 

 

10. Geospatial data standards need to be thoroughly integrated into the standards development and advisory services that we support and recommend.

 

 

11. We should aim to ensure that all of our geospatial services are interoperable, and we should strive to link our geospatial services with other JISC content providers to enrich these resources.

 

 

12. Digital rights management tools have to keep pace with changes in the legal framework that underpins our collections and the expectations of what should be possible in a service oriented approach to information provision.

 

 

13. Rights pertaining to derived data need to be more thoroughly understood to encourage the sharing and re-deployment of existing resources.

 

 

14. Extensions to the copyright licence should be implemented in full to meet the widest needs of the community.

 

 

15. We should respond flexibly to research needs, especially where this relates to the provision of data to researchers. This should be based on a detailed study of user needs and will require service models that avoid short-term subscription-based cost recovery.

 

 

16. Mechanisms to allow researchers to share data and maps as by-products of research should be supported. In particular there should be discussion between the Research Councils and the JISC on access to specialist data and maps produced by the research community.

 

 

17. The JISC community would benefit if Research Councils’ resources were made available to the wider community so that derived data products could be made freely and easily available to the whole academic community.

 

 

18. We need to develop a practical action plan that will embed and disclose JISC’s Geospatial data collections to the Grid computing community.

 

 

19. The JISC Committee for the Support of Research and the Geospatial Working Group should have a closer working relationship.

 

 

20. We need to work with CETLs and HEA in support of teachers and learners so that they can make the best possible use of geospatial resources.

 

 

21. Our ambitions to support teaching and learning with geospatial resources should be broadly based. Geography and cognate disciplines should not be the sole beneficiaries or vehicles for that work.

 

 

22. Clearer guidance is needed to ensure that data centres and institutions provide complementary and consistent levels of support to users of geospatial data sets.

 

 

23. JISC service staff should not be expected to fulfil needs that institutions have an obligation to meet. Institutions that divest themselves of in-house expertise to support the deployment of geospatial resources should be aware of the impact this will have on users. They must not be permitted to offload this expense to JISC.

 

 

24. JISC should lobby the HE FCs regarding the provision of advice to institutions on the levels of institutional support and expertise they need to offer in order to participate in the various agreements and programmes.

 

 

 

 

Comments (1)

Peter Halls said

at 10:31 am on Oct 17, 2008

Firstly, the mundane: 4th paragraph, first line, delete 'on' from 'to on the needs'; Page 2, second bullet point, needs a full stop at the end.

Now the content. The second bullet point needs to refer also to licensing conditions and I'd be happier for the cost phrase to be something like 'available at a cost and with licensing conditions that reflect the national interest'.

Formal vs informal - 2nd bullet point: perhaps add a pointer to the risk to results quality due to a lack of understanding of the implications on the part of the user. 3rd bullet point: this is not 'especially in the physical sciences' and affects almost all disciplines. It would be better to word this 'research, where the data are complex or voluminous or spatial analysis,' 4th bullet point: the discipline list is too narrow: 'many students from all disciplines' would be better; add '(c) wider, especially non-UK, data availability'.

Information Management best practice, 2nd bullet point: I think that there are also discipline related differences too, eg medical and demographic.

Tools & Technology 8th bullet point: might a 'virtual community' offer a partial solution to access to support?

Infrastructure 10th bullet point: an uninformed reader may not realise the difference here between metadata for resource discovery (eg DC) and for archival storage (eg OGC/ISO) - this probably needs to be drawn out.
