User Guidelines for GIS Data
Assumes installation of the relevant tools
Data Sources
See GIS/Data
Import Data
UUIDs
See: UUID
CSV
Locations can be imported using the normal import procedure:
- Prepopulate tasks.cfg:
gis,location,location.csv,location.xsl
- After install: use the Importer at http://host.domain/eden/gis/location/import
The CSV needs to have specific columns:
- WKT column if we have polygon info (or Lat and Lon for Points, if not)
- For L1, we need these columns: Country, L1 (& WKT)
- For L2, we need these columns: L1, L2 (& WKT) [Country can also be used to help separate duplicates]
- For L3, we need these columns: L2, L3 (& WKT) [L1 and Country can also be used to help separate duplicates]
- For L4, we need these columns: L3, L4 (& WKT) [L2, L1 and Country can also be used to help separate duplicates]
- For specific locations, we need these columns: Lx (for the appropriate parent level of the hierarchy), Name (& Lat/Lon) [L2, L1 and Country can also be used to help separate duplicates]
- Key/Value columns are used, if present
- e.g.: "L1 KV:GADM" adds a record in gis_location_tag for every L1 record with a key of "GADM" & a value of whatever is specified in the cell (see the example sketch after this list).
- Population & Elevation columns are read, if present
- Ensure that names are consistent between Levels
- The PROPER() spreadsheet function is useful to get the names in the correct format (then Paste as Text).
- The VLOOKUP() spreadsheet function is useful if the different levels of the hierarchy are in different sheets & linked via a code instead of the name (which is what we need):
- If the lookup sheet is called 'lookup' and the lookup table is in B2:C87, then to look up the code in C2 against column 2 of the lookup table: =VLOOKUP(C2; lookup.B$2:C$87; 2; 0)
- To remove duplicate rows, create a new column containing =IF(A1=A2;1;0), then Paste Special (values only), Sort & delete the rows flagged with 1.
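As a concrete illustration of the column layout described in the list above, here is a minimal Python sketch that writes an L1 import file with a WKT column and a Key/Value column (the place name, polygon and GADM code are made-up examples):
import csv

# Hypothetical example of the expected layout for an L1 import:
# Country identifies the parent, L1 is the name being imported, WKT carries
# the polygon and the "L1 KV:GADM" column becomes a gis_location_tag record
# (key "GADM", value taken from the cell).
header = ["Country", "L1", "WKT", "L1 KV:GADM"]
rows = [
    ["Timor-Leste", "Baucau",
     "POLYGON((126.3 -8.4, 126.8 -8.4, 126.8 -8.7, 126.3 -8.4))",
     "TLS.2_1"],
]

out = open("location.csv", "wb")   # matches the location.csv entry in tasks.cfg above
writer = csv.writer(out)
writer.writerow(header)
writer.writerows(rows)
out.close()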
Basic Hierarchy can often be found from Wikipedia (although currently there's no easy way to download this - a student project to enhance Wikipedia for this would be much appreciated! e.g. Using WikiData).
For the Polygon data, it is normal to get this from Shapefiles, such as GADM or UN CODS.
Shapefiles
Inspect the data using qGIS.
Use ogr2ogr to convert the data to CSV:
ogr2ogr -select ISO,NAME_1,NAME_2 -f CSV CSV USA_adm2.shp -lco GEOMETRY=AS_WKT
ogr2ogr -f CSV CSV TM_WORLD_BORDERS-0.3.shp -lco GEOMETRY=AS_WKT
ogr2ogr -f geojson TM_WORLD_BORDERS-0.3.json TM_WORLD_BORDERS-0.3.shp
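To check which attribute columns are available to -select without opening qGIS, a quick sketch using the GDAL/OGR Python bindings (file name taken from the example above):
from osgeo import ogr

# Open the Shapefile read-only and list its attribute fields, so you know
# which column names to pass to ogr2ogr -select.
ds = ogr.Open("USA_adm2.shp")
layer = ds.GetLayer(0)
defn = layer.GetLayerDefn()
for i in range(defn.GetFieldCount()):
    field = defn.GetFieldDefn(i)
    print("%s (%s)" % (field.GetName(), field.GetTypeName()))
print("%d features" % layer.GetFeatureCount())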
If needing to reproject (e.g. for the Haiti Departements):
ogr2ogr -f CSV haiti_departments Haiti_departementes_edited_01132010.shp -s_srs EPSG:32618 -t_srs EPSG:4326 -lco GEOMETRY=AS_WKT
NB AS_WKT requires OGR v1.6+
If the data is Admin Boundaries then it can be imported into the gis_location table via http://host.domain/eden/gis/location/import
Otherwise the data can be imported into the gis_theme_data table via http://host.domain/eden/gis/theme_data/import
- you will first need to define the Layer and activate it in the config(s) of your choice
An alternative way to deal with Shapefiles is to upload to GeoServer & serve as WMS/WFS from there...
Dissolving Polygons (e.g. recreate L0 from L1s):
ogr2ogr output.shp input.shp -dialect sqlite -sql "SELECT ST_Union(geometry), dissolve_field FROM input GROUP BY dissolve_field"
GDB
ESRI's File Geodatabase format.
Can use this online service:
or use GDAL:
NB The FileGDB driver isn't included in GDAL by default. Windows users can use OSGeo4W to install the gdal-filegdb package.
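Once the driver is installed, you can check that GDAL can actually read the Geodatabase and see which layers it contains, e.g. with the Python bindings (the .gdb name is hypothetical):
from osgeo import ogr

# List the layers in an ESRI File Geodatabase; requires a GDAL build with
# the FileGDB (or OpenFileGDB) driver.
ds = ogr.Open("MyData.gdb")
if ds is None:
    raise RuntimeError("GDAL could not open the Geodatabase - is the driver installed?")
for i in range(ds.GetLayerCount()):
    layer = ds.GetLayer(i)
    print("%s: %d features" % (layer.GetName(), layer.GetFeatureCount()))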
KML
Can convert a KML to CSV using the attached script: python KML2WKT.py <filename>.kml
- requires keytree
This can then be imported into Sahana by editing the column headers & using the Importer
qGIS can be used to convert this into a Shapefile (it uses ogr2ogr underneath, so you can also do this from the CLI if you prefer): give it column headers with 'WKT' as the WKT column name.
- This is the easiest way to load into PostGIS (using PGAdmin III's Shapefile Importer plugin) to allow GeoServer to serve as WMS
An alternate approach is to use this XSL:
Geonames
There is an import_geonames() function in S3GIS which downloads/unzips the country file (a TAB-separated list) from http://download.geonames.org/export/dump/
It should be run for each level of the hierarchy that you wish to import. Generally this is just the lowest level: Geonames only has Point data, so it is best to import other sources for the Polygons first; the Geonames importer can then locate the Points within the correct Polygons of the hierarchy.
NB It takes some time to do this import! Pakistan imports 95000 locations!
Update: Geonames schema 2.2 supports parentADM(1-4): http://geonames.wordpress.com/2010/09/29/geonames-ontology-2-2/
- will be good for when we only have hierarchy, not polygons
- need to check whether much data has this populated though.
Python 2.5 doesn't support ZipFile.extract() & ZipFile.read() isn't unicode-safe. Until this is fixed, download the file manually first:
cd ~web2py/applications/eden/cache
wget http://download.geonames.org/export/dump/PK.zip
unzip PK.zip
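If you would rather script the manual download & extract (avoiding ZipFile.extract(), which Python 2.5 lacks), a rough sketch for the same PK example, run from the web2py folder so the relative cache path matches:
import os
import urllib
import zipfile

# Fetch the Geonames country dump into the Eden cache folder and extract
# it member-by-member, without relying on ZipFile.extract().
cache = os.path.join("applications", "eden", "cache")
zip_path = os.path.join(cache, "PK.zip")
urllib.urlretrieve("http://download.geonames.org/export/dump/PK.zip", zip_path)

zf = zipfile.ZipFile(zip_path)
for name in zf.namelist():
    out = open(os.path.join(cache, name), "wb")
    out.write(zf.read(name))
    out.close()
zf.close()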
In Web2py CLI:
gis.import_geonames('PK', 'L5')
db.commit()
Alternate approach:
- Transform each line in this file into XML by regular expression:
^(\d*)\t([^\t]*)\t([^\t]*)\t([^\t]*)\t([0-9\.]*)\t([0-9\.]*)\t[^\t]*\t([A-Z]*).*
into:
<location>
  <id>$1</id>
  <name>$2</name>
  <asciiName>$3</asciiName>
  <localNames>$4</localNames>
  <lat>$5</lat>
  <lon>$6</lon>
  <featureClass>$7</featureClass>
</location>
This can be done using an RE-capable editor (e.g. Kate), Perl or even Python (a sketch follows this list). Note: you need to replace & with &amp; and remove any invalid characters.
- Transform into S3XRC-XML using XSLT; a stylesheet is available at
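A rough Python version of the line-by-line transform described above (input/output file names are assumptions; a root element is added so the result is well-formed XML for the XSLT step):
import re

# Apply the regular expression above to each line of the Geonames dump,
# escaping & as &amp; on the way.
pattern = re.compile(
    r"^(\d*)\t([^\t]*)\t([^\t]*)\t([^\t]*)\t([0-9\.]*)\t([0-9\.]*)\t[^\t]*\t([A-Z]*).*")
template = ("<location><id>%s</id><name>%s</name><asciiName>%s</asciiName>"
            "<localNames>%s</localNames><lat>%s</lat><lon>%s</lon>"
            "<featureClass>%s</featureClass></location>\n")

out = open("PK.xml", "w")
out.write("<locations>\n")
for line in open("PK.txt"):
    match = pattern.match(line.replace("&", "&amp;"))
    if match:
        out.write(template % match.groups())
out.write("</locations>\n")
out.close()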
OpenStreetMap
WFS
It is possible to use the WFS Plugin to get data into qGIS & thence export into other formats.
May need to use a Custom CRS (in Settings menu - remember to Save!) such as:
- ESRI's Spherical Mercator (different to 900913) http://spatialreference.org/ref/esri/54004/proj4/
Can then go to the Layer Properties & Specify CRS to this User Defined Coordinate System.
Can then Save As and change the CRS to something like the standard WGS84.
Yahoo
- http://developer.yahoo.com/geo/geoplanet/data/
- Script to process http://pastie.org/1139680
Display Data
GeoServer
GeoServer can provide geospatial data in Raster (WMS) or Vector (WFS/KML) formats.
Once you have installed it on Linux or Windows, log in with the default credentials:
- username: admin
- password: geoserver
PostGIS is recommended as the main data store.
- Create a dedicated DB for GIS data:
su postgres
psql
CREATE USER gis WITH PASSWORD 'GIS';
\q
createdb -O gis gis
psql
\c gis
CREATE EXTENSION postgis;
GRANT CONNECT ON DATABASE gis to geoserver;
GRANT SELECT ON gis_location to geoserver;
GRANT SELECT ON geometry_columns to geoserver;
GRANT SELECT ON spatial_ref_sys to geoserver;
- Allow Read access to GeoServer to the Sahana DB:
su postgres
psql
CREATE USER geoserver WITH PASSWORD 'geoserver';
\c sahana
GRANT CONNECT ON DATABASE sahana to geoserver;
GRANT SELECT ON gis_location to geoserver;
GRANT SELECT ON geometry_columns to geoserver;
GRANT SELECT ON spatial_ref_sys to geoserver;
GRANT SELECT ON stats_demographic to geoserver;
GRANT SELECT ON stats_demographic_data to geoserver;
GRANT SELECT ON stats_demographic_aggregate to geoserver;
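Before pointing GeoServer at the database, you can sanity-check that the restricted role can actually read the granted tables, e.g. with a short Python script (psycopg2 assumed to be available):
import psycopg2

# Connect as the read-only 'geoserver' role created above and check that
# the granted tables are readable.
conn = psycopg2.connect(host="localhost", dbname="sahana",
                        user="geoserver", password="geoserver")
cur = conn.cursor()
for table in ("gis_location", "geometry_columns", "spatial_ref_sys"):
    cur.execute("SELECT COUNT(*) FROM %s" % table)
    print("%s: %s rows" % (table, cur.fetchone()[0]))
conn.close()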
Configure:
- Admin Password
- Contact Details
- Disable Global Services (2.1+ only)
- WFS Details
- WMS Details, including the Limited SRS list - probably restrict to just:
4326, 900913
- Disable the demo Layers & Layer Groups
- Add Workspace
- Add PostGIS Store
Import CSV
- Prepare the CSV (no extraneous columns, columns labelled the same as the schema, WKT data in a column; see the sketch after this list for moving the WKT column to the end)
- Create the Schema in PostGIS:
CREATE TABLE my_table (
    gid serial NOT NULL,
    name character varying(50),
    the_geom geometry,
    CONSTRAINT my_table_pkey PRIMARY KEY (gid),
    CONSTRAINT enforce_dims_the_geom CHECK (st_ndims(the_geom) = 2)
);
CREATE INDEX my_table_the_geom_gist ON my_table USING gist (the_geom);
- Import the CSV
psql
\c gis
\copy my_table(name,the_geom) FROM 'my_csv.csv' DELIMITERS ',' CSV HEADER;
- Help was taken from: http://www.kevfoo.com/2012/01/Importing-CSV-to-PostGIS/
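The \copy above expects the geometry column last; if your CSV has WKT elsewhere (ogr2ogr puts it first), a small sketch along the lines of the attached moveWKT.py (input/output file names assumed):
import csv

# Rewrite a CSV so that the WKT column comes last, matching the column
# order expected by the \copy command above.
reader = csv.reader(open("my_csv_raw.csv", "rb"))
header = reader.next()
wkt = header.index("WKT")
order = [i for i in range(len(header)) if i != wkt] + [wkt]

writer = csv.writer(open("my_csv.csv", "wb"))
writer.writerow([header[i] for i in order])
for row in reader:
    writer.writerow([row[i] for i in order])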
Import GeoJSON
Convert to Shapefile:
ogr2ogr -f 'ESRI Shapefile' NYC_1m_Boundary_Clipped_WGS84webmerc.shp NYC_1m_Boundary_Clipped_WGS84webmerc.geojson
Then proceed as below (Import Shapefiles):
Import Shapefiles
e.g. Country Outlines:
These can be loaded directly into GeoServer; however, you will get better performance by importing into PostGIS first:
(can also use pgAdmin III GUI's Shapefile loader on plugins menu)
su postgres
shp2pgsql -s 4326 -I TM_WORLD_BORDERS-0.3.shp public.countries | psql -d gis
To reproject the data into 900913 for a slight performance advantage:
-- e.g. for the countries table loaded above (constraint name as created by shp2pgsql):
ALTER TABLE countries DROP CONSTRAINT enforce_srid_the_geom;
UPDATE countries SET the_geom = ST_Transform(the_geom, 900913);
Configure GeoServer
Colours:
Zoom Levels (in Spherical Mercator):
Zoom Level(s) | MinScale | MaxScale |
---|---|---|
1 | 250000000 | n/a |
2 | 100000000 | 250000000 |
3 | 50000000 | 100000000 |
4 | 25000000 | 50000000 |
5 | 10000000 | 25000000 |
6 | 5000000 | 10000000 |
7 | 2500000 | 5000000 |
8 | 2000000 | 2500000 |
9 | 1000000 | 2000000 |
10 | 500000 | 1000000 |
11 | 250000 | 500000 |
12 | 100000 | 250000 |
13 | 50000 | 100000 |
14-22 | n/a | 50000 |
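The MinScale/MaxScale breakpoints above are rounded values that bracket the natural scale denominator of each zoom level; assuming 256-pixel tiles and the OGC 0.28 mm pixel convention, they can be derived as follows:
import math

# Scale denominators for spherical Mercator (900913) zoom levels, assuming
# 256px tiles and the OGC standard pixel size of 0.28 mm - the table above
# uses rounded breakpoints that bracket these values.
circumference = 2 * math.pi * 6378137              # equatorial circumference in metres
for zoom in range(1, 15):
    resolution = circumference / (256 * 2 ** zoom)  # metres per pixel
    scale = resolution / 0.00028                    # scale denominator
    print("zoom %2d ~ 1:%d" % (zoom, round(scale)))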
Configure GeoWebCache
The raw WMS server will be slow, so once you've chosen your style you should serve via GWC. This caches pre-rendered tiles & also does MetaTiling, so that the WMS receives fewer separate requests (at the cost of increased RAM requirements).
The version embedded within GeoServer is great for providing zero-configuration handling of the common options, however there are cases where you need to define a layer manually:
- Want a Background Colour (bgcolor)
- Want to specify an alternate style (& you'd rather not republish the layer on the WMS)
- Want to render a set of layers into a single tileset (so that clients don't need to download them separately & merge locally)
Tips for optimal usage:
An example geowebcache.xml is attached in the GADM section below.
WFS
If you are displaying a complex dataset at zoomed-out resolutions, then you will want to have simplified views.
e.g. Hospitals aren't shown at all at low zooms, are shown as Points at medium zooms & shown as Polygons at high zooms.
Scale-dependent styling using SLD in GeoServer:
Scale-dependent styling in OpenLayers:
Simplifying Polygons in PostGIS:
Showing the different layers at different zooms using GeoServer:
- http://docs.geoserver.org/stable/en/user/tutorials/feature-pregeneralized/feature-pregeneralized_tutorial.html
cd
wget http://kent.dl.sourceforge.net/project/geoserver/GeoServer%20Extensions/2.1.0/geoserver-2.1.0-feature-pregeneralized-plugin.zip
cd /var/lib/tomcat6/webapps/geoserver/WEB-INF/lib/
unzip ~/geoserver-2.1.0-feature-pregeneralized-plugin.zip
/etc/init.d/tomcat6 restart
Add WMS Layer to Sahana Eden
- tbc
WMS Reprojection
- Have a remote WMS source that you want to access?
- Have a desire to keep OpenStreetMap/Google/Bing layers?
- WMS source server doesn't support the 900913 projection?
e.g. TRMM Rainfall Monitoring
Solution: MapProxy
Grid
We have a 'Coordinate Grid' Layer available by default.
Other options:
Administrative Areas
GADM
GADM is the best source of global Administrative Boundaries:
There are often better local sources for specific countries, although getting hold of these can be difficult. Note that some countries have boundaries which change frequently and so datasets can often be a little out of date.
To import into Sahana Eden's gis_location table (for consistency of naming/boundaries across basemap & dynamic data):
- Install latest Python GDAL bindings
- Open a web2py CLI:
python web2py.py -S eden -M
- Optionally, define a filter for the countries for which you wish to import data, e.g. for Asia-Pacific (without TL, as that will be imported from UN CODS):
countries = [ "AF", "AU", "BD", "BN", "CK", "CN", "FJ", "FM", "HK", "ID", "IN", "JP", "KH", "KI", "KP", "KR", "LA", "MH", "MM", "MN", "MV", "MY", "NP", "NZ", "PG", "PH", "PK", "PW", "SB", "SG", "SL", "TH", "TO", "TV", "TW", "VN", "VU", "WS"]
- Import:
gis.import_admin_areas(countries=countries)
This can then be served as separate WMS layers using GeoServer & GeoWebCache.
You can use GeoServer's SQLView feature.
SLD files are attached:
- L0 Base
- SQL View: SELECT id, name, area, the_geom FROM gis_location WHERE level='L0'
- L0 Overlay
- SQL View: SELECT id, name, area, the_geom FROM gis_location WHERE level='L0'
- L1 Overlay
- SQL View: SELECT id, name, area, the_geom FROM gis_location WHERE level='L1'
- L2 Overlay
- SQL View: SELECT id, name, area, the_geom FROM gis_location WHERE level='L2'
Example geowebcache.xml for GADM attached which provides 3 layers:
- L0 Base
- L0-L2 Base (merged)
- L0-L2 Overlay (merged)
cp geowebcache.xml /var/gis/geoserver_data/gwc
/etc/init.d/tomcat6 restart
Population Density
GPWv3
Gridded Population of the World, version 3 (GPWv3) is the standard global dataset for both measured & projected population densities.
Download gl_gpwfe_pdens_10_wrk_25.zip from (requires registration):
mkdir /home/data/GPWv3
cd /home/data/GPWv3
unzip gl_gpwfe_pdens_10_wrk_25.zip
/usr/local/bin/gdal_translate -of GTiff glfedens10/glds10ag/hdr.adf glds10ag.tif
ln -s /home/data/GPWv3 /var/gis/geoserver_data/coverages/GPWv3
Add new GeoTIFF Store to GeoServer to serve as WMS:
URL: file:coverages/GPWv3/glds10ag.tif
Style: populationDensity.sld (attached)
GRUMPv1
Global Rural-Urban Mapping Project, version 1 (GRUMPv1) is a newer dataset which combines satellite with census data to locate people within settlements rather than just administrative areas:
US Census
For the US, data is available down to block level:
http://factfinder2.census.gov/faces/nav/jsf/pages/download_center.xhtml - Decennial Census 2010 SF1 100% Data
Convert data from NAD83 to WGS84 (e.g. using qGIS)
su postgres
cd /data/Census2010BlockGroup
shp2pgsql -s 4326 -I tl_2010_06037_bg10_WGS84.shp public.Census2010BlockGroup | psql -d gis
psql
\c gis
ALTER TABLE census2010blockgroup ADD COLUMN population integer;
ALTER TABLE census2010blockgroup ADD COLUMN population_density integer;
\q
exit
w2p            # launch the web2py/Eden interactive shell
%autoindent
pop_dict = {}
input = os.path.join("/", "home", "data", "Census2010BlockGroup", "DEC_10_SF1_P1_with_ann.csv")
inputFile = open(input, "r")
header = 2
for line in inputFile:
    if header:
        # Skip the 2 header rows
        header -= 1
        continue
    parts = line.split(",", 6)
    geoid = parts[1]
    pop = int(parts[6].strip())
    pop_dict[geoid] = pop
inputFile.close()
from __future__ import division
import os
db_string = "postgres://gis:GIS@localhost:5432/gis"
db2 = DAL(db_string, migrate_enabled=False)
table = db2.define_table("census2010blockgroup",
                         Field("gid", "id"),
                         Field("geoid10"),
                         Field("aland10", "integer"),
                         Field("population", "integer"),
                         Field("population_density", "integer"),
                         migrate=False)
rows = db2().select(table.geoid10, table.aland10)
for row in rows:
    data = {}
    geoid10 = row.geoid10
    population = pop_dict.get(geoid10)
    if population:
        data["population"] = population
        aland10 = row.aland10
        if aland10:
            # ALAND10 is in square metres; 2589988 m2 = 1 square mile
            area = aland10 / 2589988
            population_density = population / area
            data["population_density"] = population_density
        db2(table.geoid10 == geoid10).update(**data)
db2.commit()
Can also add the Tract-level data for lower zooms (9-15) & create a Layer Group to serve the two layers with a style which shows only one level at each zoom.
Topographic Maps
WMS
Topography can be rendered as WMS using e.g. GeoServer
Download the SRTMs in GeoTIFF format from ftp://xftp.jrc.it/pub/srtmV4/tiff
- get.sh can assist with this
- Beware this is a *lot* of data! (At least 60 GB for a global dataset)
- The GeoTIFFs are already well compressed
- Partial datasets are, of course, possible (@ToDo: a script to select an area & download just that area - a rough starting point is sketched after the commands below)
mkdir /tmp/SRTMv4
cd /tmp/SRTMv4
sh get.sh
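As a rough starting point for the @ToDo above, the following sketch lists the 5°x5° tile names covering a bounding box, assuming the CGIAR srtm_XX_YY naming convention (column 1 at 180°W, row 1 at 60°N) - verify the names against the FTP listing before relying on it:
import math

# List the SRTMv4 5-degree tiles covering a WGS84 bounding box.
# Assumes the CGIAR grid: column 1 starts at 180W, row 1 at 60N.
def srtm_tiles(lon_min, lat_min, lon_max, lat_max):
    tiles = []
    col_min = int(math.floor((lon_min + 180) / 5)) + 1
    col_max = int(math.floor((lon_max + 180) / 5)) + 1
    row_min = int(math.floor((60 - lat_max) / 5)) + 1
    row_max = int(math.floor((60 - lat_min) / 5)) + 1
    for col in range(col_min, col_max + 1):
        for row in range(row_min, row_max + 1):
            tiles.append("srtm_%02d_%02d.zip" % (col, row))
    return tiles

# e.g. a rough bounding box for Pakistan:
for name in srtm_tiles(60.5, 23.5, 77.5, 37.5):
    print("ftp://xftp.jrc.it/pub/srtmV4/tiff/" + name)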
Unzip the data into a folder called 'SRTMv4' in the GeoServer 'coverages' folder (or use a symlink):
cd /var/lib/tomcat6/webapps/geoserver/data/coverages
mkdir SRTMv4
cd SRTMv4   # assumed: extract into the new folder, as the paths below expect
unzip -o /tmp/SRTMv4/\*.zip
rm /var/lib/tomcat6/webapps/geoserver/data/coverages/SRTMv4/*.hdr
rm /var/lib/tomcat6/webapps/geoserver/data/coverages/SRTMv4/*.tfw
rm /var/lib/tomcat6/webapps/geoserver/data/coverages/SRTMv4/readme.txt
Give Tomcat permission to the folder:
chown tomcat6 /var/lib/tomcat6/webapps/geoserver/data/coverages/SRTMv4
Configure GeoServer by adding a new Store using the Image mosaicking plugin
- URL: file:coverages/SRTMv4
Publish (defaults OK)
Styling:
- srtm.sld - 'official' SRTM styling (attached)
- topography.sld - 'common usage' styling (attached)
Test out using the direct WMS URL:
Once happy then start using the GeoCache URL (but don't do this too early as otherwise you have to invalidate the cache to see your changes):
OpenStreetMap
Contours can be rendered using OSM tools:
- GIS/OpenStreetMap
- http://wiki.openstreetmap.org/wiki/Contours
- Alternate approach: SRTM 2 OSM
http://de.wikipedia.org/wiki/Benutzer:Alexrk2/SRTM-Reliefs
- suggests using GIMP's emboss filter! Azimuth = 135, height = 50, depth = 10
Old Printed Maps
Old Printed Maps can be 'Rectified' to be overlaid on the base maps:
OpenStreetMap
PostgreSQL management
PostGIS functions
- Centroids
SELECT name, iso2, asText(ST_Transform(ST_Centroid(the_geom), 4326)) AS centroid FROM countries;
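To run the same query from the web2py shell (e.g. to post-process the results in Python), a sketch reusing the DAL connection string from the US Census example above:
# Query country centroids from the PostGIS 'gis' database via web2py's DAL.
db2 = DAL("postgres://gis:GIS@localhost:5432/gis", migrate_enabled=False)
rows = db2.executesql(
    "SELECT name, iso2, asText(ST_Transform(ST_Centroid(the_geom), 4326)) "
    "AS centroid FROM countries;")
for name, iso2, centroid in rows:
    print("%s (%s): %s" % (name, iso2, centroid))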
Attachments (12)
- srtm.sld (2.5 KB) - Official SRTM styling
- topography.sld (2.1 KB) - Common Topography styling
- get.sh (48.9 KB) - Script to download SRTMv4 GeoTIFFs
- utf8.py (1.0 KB) - Encode GADM as UTF8
- geowebcache.xml (9.5 KB) - Configuration for GADM
- gadm_v1_lev0_base_sld.xml (7.5 KB) - SLD for GADM L0 Base Layer
- gadm_v1_lev0_overlay_sld.xml (12.5 KB) - SLD for GADM L0 Overlay
- gadm_v1_lev1_overlay_sld.xml (12.0 KB) - SLD for GADM L1 Overlay
- gadm_v1_lev2_overlay_sld.xml (12.4 KB) - SLD for GADM L2 Overlay
- populationDensity.sld (1.4 KB) - GPWv3 SLD
- moveWKT.py (1.0 KB) - Move WKT column to end of CSV
- KML2WKT.py (1.9 KB) - Convert KML to CSV with WKT column