
User Guidelines for GIS Data

Assumes installation of the relevant tools

Import Data

UUIDs

See: UUID

CSV

There is a function available in modules/s3/s3gis.py to import from CSV.

The CSV needs to have specific columns (a minimal example follows this list):

  • WKT column if we have polygon info (or Lat Lon for Points, if not)
  • For L1, we need these columns: ADM0_NAME, ADM1_NAME (& WKT)
  • For L2, we need these columns: ADM1_NAME, ADM2_NAME (& WKT) [ADM0_NAME can also be used to help separate duplicates]
  • For L3, we need these columns: ADM2_NAME, ADM3_NAME (& WKT) [ADM1_NAME can also be used to help separate duplicates]
  • A CODE column is read, if present
  • A POPULATION column is read, if present
  • Ensure that names are consistent between Levels
  • The PROPER() spreadsheet function is useful to get the names in the correct format (then Paste as Text).
  • The VLOOKUP() spreadsheet function is useful if the different levels of the hierarchy are in different sheets & linked via a code instead of the name (names are what we need)
  • To remove duplicate rows, you can create a new column with =IF(A1=A2;1;0), Paste Special (as values), then Sort & delete the rows marked 1
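
For illustration, a minimal L1 file could be produced like this (a sketch only: the names & the WKT polygon are placeholders, not real boundaries):

import csv

# Sketch: write an L1 CSV with the columns listed above.
# The admin names & the WKT polygon are placeholders.
f = open("pak_adm1.csv", "wb")  # Python 2 mode, as used with web2py
writer = csv.writer(f)
writer.writerow(["ADM0_NAME", "ADM1_NAME", "WKT"])
writer.writerow(["Pakistan", "Punjab",
                 "POLYGON((70 29, 75 29, 75 34, 70 34, 70 29))"])
f.close()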

Basic Hierarchy can often be found from Wikipedia (although currently there's no easy way to download this - a student project to enhance Wikipedia for this would be much appreciated!).

For the Polygon data, it is normal to get this from Shapefiles (see below).

Example for Pakistan:

tablename = "gis_location"
table = db[tablename]
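# Drop the index on 'name' to speed up the bulk import (it is recreated at the end)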
db.executesql("DROP INDEX name__idx on %s;" % tablename)
# L0
import csv
csv.field_size_limit(2**20 * 10)  # 10 megs
db.import_from_csv_file(open("L0.csv", "rb"))
db.commit()
# L1
gis.import_csv("pak_adm1.csv", check_duplicates=False)
db.commit()
# L2
db(table.name == "Baluchistan").update(name="Balochistan")
db(table.name == "Northern Areas").update(name="Gilgit Baltistan")
db(table.name == "N.W.F.P.").update(name="Khyber Pakhtunkhwa")
db(table.name == "F.A.T.A.").update(name="FATA")
db(table.name == "F.C.T.").update(name="Islamabad")
db(table.name == "Azad Kashmir").update(name="AJK")
gis.import_csv("pak_adm2.csv", check_duplicates=False)
db(table.name == "Sind").update(name="Sindh")
db(table.name == "AJK").update(name="Pakistan Administered Kashmir")
db(table.name == "FATA").update(name="Federally Administered Tribal Areas")
db((table.name == "Islamabad") & (table.level == "L1")).update(name="Federal Capital Territory")
db.commit()
# L3
db(table.name == "Jaccobabad").update(name="Jacobabad")
db(table.name == "Tando Allahyar").update(name="Tando Allah Yar")
db(table.name == "Qambar Shahdad kot").update(name="Qambar Shahdadkot")
gis.import_csv("pak_adm3.csv", check_duplicates=False)
db(table.name == "Islamabad").update(name="Islamabad Capital Territory")
db(table.name == "Tando Allah Yar").update(name="Tando Allahyar")
db(table.name == "Qambar Shahdadkot").update(name="Qambar Shahdad Kot")
db(table.name == "Leiah").update(name="Layyah")
db(table.name == "Leiah Tehsil").update(name="Layyah Tehsil")
db(table.name == "Kalur Kot Tehsil").update(name="Kallur Kot Tehsil")
db(table.name == "De-excluded Area").update(name="Tribal Area")
db(table.name == "De-excluded Area D.g Khan").update(name="Tribal Area")
db.commit()
# L4
db(table.name == "Noorpur Tehsil").update(name="Noorpur Thal Tehsil")
jhang = db((table.name == "Jhang") & (table.level == "L2")).select(table.id, limitby=(0, 1)).first().id
table.insert(name="Ahmadpur Sial", parent=jhang, level="L3", url="http://en.wikipedia.org/wiki/Ahmedpur_Sial_Tehsil")
gis.import_csv("punjab_l4.csv", check_duplicates=False)
db.commit()
db(table.name == "Mirwah Taluka").update(name="Thari Mirwah Taluka")
db(table.name == "Shah Bunder Taluka").update(name="Shah Bandar Taluka")
badin = db((table.name == "Badin") & (table.level == "L2")).select(table.id, limitby=(0, 1)).first().id
table.insert(name="Talhar", parent=badin, level="L3", url="http://en.wikipedia.org/wiki/Talhar")
jamshoro = db((table.name == "Jamshoro") & (table.level == "L2")).select(table.id, limitby=(0, 1)).first().id
table.insert(name="Manjhand Taluka", parent=jamshoro, level="L3", url="http://en.wikipedia.org/wiki/Jamshoro_District")
gis.import_csv("sindh_l4.csv", check_duplicates=False)
db.commit()
db(table.name == "F.r Kala Dhaka").update(name="F.R. Kala Dhaka")
db(table.name == "Martoong Tehsil").update(name="Martung Tehsil")
db(table.name == "Takhat Nasrati Tehsil").update(name="Takht-e-Nasrati Tehsil")
dikhan = db((table.name == "D. I. Khan") & (table.level == "L2")).select(table.id, limitby=(0, 1)).first().id
table.insert(name="Daraban Tehsil", parent=dikhan, level="L3")
table.insert(name="Paroa Tehsil", parent=dikhan, level="L3")
lowerdir = db((table.name == "Lower Dir") & (table.level == "L2")).select(table.id, limitby=(0, 1)).first().id
table.insert(name="Adenzai", parent=lowerdir, level="L3")
table.insert(name="Balambat", parent=lowerdir, level="L3")
table.insert(name="Khal", parent=lowerdir, level="L3")
table.insert(name="Lal Qila", parent=lowerdir, level="L3")
table.insert(name="Munda", parent=lowerdir, level="L3")
table.insert(name="Samar Bagh", parent=lowerdir, level="L3")
table.insert(name="Tazagram", parent=lowerdir, level="L3")
table.insert(name="Timargara", parent=lowerdir, level="L3")
upperdir = db((table.name == "Upper Dir") & (table.level == "L2")).select(table.id, limitby=(0, 1)).first().id
table.insert(name="Barawal Tehsil", parent=upperdir, level="L3")
table.insert(name="Chapar Tehsil", parent=upperdir, level="L3")
table.insert(name="Dir Tehsil", parent=upperdir, level="L3")
table.insert(name="Khal Tehsil", parent=upperdir, level="L3")
table.insert(name="Kalkot Tehsil", parent=upperdir, level="L3")
table.insert(name="Wari Tehsil", parent=upperdir, level="L3")
gis.import_csv("khyber_l4.csv", check_duplicates=False)
db.commit()
# L5
gis.import_csv("punjab_l5.csv", check_duplicates=False)
gis.import_csv("sindh_l5.csv", check_duplicates=False)
gis.import_csv("khyber_l5.csv", check_duplicates=False)
db.commit()
field = "name"
db.executesql("CREATE INDEX %s__idx on %s(%s);" % (field, tablename, field))

Shapefiles

Inspect the data using qGIS.

Use ogr2ogr to convert the data to CSV:

ogr2ogr -f CSV CSV TM_WORLD_BORDERS-0.3.shp -lco GEOMETRY=AS_WKT
ogr2ogr -f geojson TM_WORLD_BORDERS-0.3.json TM_WORLD_BORDERS-0.3.shp

If you need to reproject (e.g. for the Haiti Departements):

ogr2ogr -f CSV haiti_departments Haiti_departementes_edited_01132010.shp -s_srs EPSG:32618 -t_srs EPSG:4326 -lco GEOMETRY=AS_WKT

NB AS_WKT requires OGR v1.6+

KML

A KML file can be converted to CSV using the attached script: python KML2WKT.py <filename>.kml

This can then be imported into Sahana by editing the column headers & using gis.import_csv(<filename>.csv)
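
For example, in the Web2py CLI (the filename here is hypothetical):

gis.import_csv("hospitals.csv")
db.commit()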

qGIS can be used to convert this into a Shapefile (it uses ogr2ogr, so you can also do this from the CLI if you prefer): give the file a column header of 'WKT' for the WKT column.

  • This is the easiest way to load the data into PostGIS (using pgAdmin III's Shapefile Importer plugin) so that GeoServer can serve it as WMS

Geonames

There is an import_geonames() function in S3GIS which downloads/unzips the country file (a TAB-separated list) from http://download.geonames.org/export/dump/

It should be run for the different levels of hierarchy that you wish to import (generally just the lowest level, since Geonames only has Point data; it is best to import other sources for the Polygons first, so that the Geonames importer can locate these Points within the correct Polygons of the hierarchy).

NB It takes some time to do this import - Pakistan alone has ~95,000 locations!

Update: Geonames schema 2.2 supports parentADM(1-4): http://geonames.wordpress.com/2010/09/29/geonames-ontology-2-2/

  • This will be useful when we only have hierarchy data, not polygons
  • We need to check how much data actually has this populated, though

Python 2.5 doesn't support ZipFile.extract() & ZipFile.read() isn't unicode-safe. Until this is fixed, download the file manually first:

cd ~web2py/applications/eden/cache
wget http://download.geonames.org/export/dump/PK.zip
unzip PK.zip

In Web2py CLI:

gis.import_geonames('PK', 'L5')
db.commit()

Alternate approach:

  1. Transform each line in this file into XML by regular expression:
    ^(\d*)\t([^\t]*)\t([^\t]*)\t([^\t]*)\t([0-9\.]*)\t([0-9\.]*)\t[^\t]*\t([A-Z]*).*
    
    into:
    
    <location>
           <id>$1</id>
           <name>$2</name>
           <asciiName>$3</asciiName>
           <localNames>$4</localNames>
           <lat>$5</lat>
           <lon>$6</lon>
           <featureClass>$7</featureClass>
    </location>
    
    

This can be done using an RE-capable editor (e.g. Kate), Perl or even Python. Note: you need to replace & with &amp; and remove any invalid characters.
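
The same line-by-line transformation can be scripted; a minimal Python sketch (the filenames are assumptions - the Pakistan dump unzips to PK.txt):

import re

# Sketch: convert each line of the Geonames dump into the XML above.
# Input/output filenames are assumptions; adjust to suit.
pattern = re.compile(r"^(\d*)\t([^\t]*)\t([^\t]*)\t([^\t]*)\t([0-9\.]*)\t([0-9\.]*)\t[^\t]*\t([A-Z]*).*")
template = """<location>
       <id>%s</id>
       <name>%s</name>
       <asciiName>%s</asciiName>
       <localNames>%s</localNames>
       <lat>%s</lat>
       <lon>%s</lon>
       <featureClass>%s</featureClass>
</location>
"""
out = open("PK.xml", "w")
out.write("<locations>\n")  # wrap in a root element so the output is well-formed
for line in open("PK.txt"):
    match = pattern.match(line)
    if not match:
        continue
    # Escape & as noted above (other invalid characters would also need stripping)
    fields = [field.replace("&", "&amp;") for field in match.groups()]
    out.write(template % tuple(fields))
out.write("</locations>\n")
out.close()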

  2. Transform into S3XRC-XML using XSLT (see the sketch below); the stylesheet is available at
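
The XSLT step can also be driven from Python via lxml; a minimal sketch (the stylesheet & file names are assumptions):

from lxml import etree

# Sketch: apply an XSLT stylesheet to the XML produced above.
# "geonames.xsl" & "PK.xml" are placeholder filenames.
transform = etree.XSLT(etree.parse("geonames.xsl"))
result = transform(etree.parse("PK.xml"))
open("PK_s3xrc.xml", "w").write(str(result))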

OpenStreetMap

See the OpenStreetMap section below: UserGuidelinesGISData

WFS

It is possible to use the WFS Plugin to get data into qGIS & thence export into other formats.

You may need to use a Custom CRS (in the Settings menu - remember to Save!) such as:

You can then go to the Layer Properties & set the CRS to this User Defined Coordinate System.

You can then Save As and change the CRS to something like the standard WGS84.

Yahoo

Display Data

GeoServer

GeoServer can provide geospatial data in Raster (WMS) or Vector (WFS/KML) formats.

Once you have installed it on Linux or Windows, log in:

  • username: admin
  • password: geoserver

PostGIS is recommended as the main data store.

Configure:

Import Shapefiles

e.g. Country Outlines:

These can be loaded directly into GeoServer, however there will be better performance by importing them into PostGIS (you can also use the pgAdmin III GUI's Shapefile loader on the Plugins menu):

su postgres
shp2pgsql -s 4326 -I TM_WORLD_BORDERS-0.3.shp public.countries | psql -d gis

To reproject the data into 900913 for a slight performance advantage:

-- Placeholder names: adjust to the table/geometry column created above (e.g. countries/the_geom)
ALTER TABLE countries DROP CONSTRAINT enforce_srid_the_geom;
UPDATE countries SET the_geom = ST_Transform(the_geom, 900913);

Configure GeoServer

Colours:

Zoom Levels (in Spherical Mercator):

Zoom Level(s) MinScale MaxScale
1 250000000 n/a
2 100000000 250000000
3 50000000 100000000
4 25000000 50000000
5 10000000 25000000
6 5000000 10000000
7 2500000 5000000
8 2000000 2500000
9 1000000 2000000
10 500000 1000000
11 250000 500000
12 100000 250000
13 50000 100000
14-22 n/a 50000

Configure GeoWebCache

The raw WMS server will be slow, so once you've chosen your style, you should serve via GWC. This caches pre-rendered tiles & also does MetaTiling so that the WMS receives fewer separate requests (at a cost of increased RAM requirements).

The version embedded within GeoServer is great for providing zero-configuration of the common options, however there are cases where you need to define a layer manually:

  • Want a Background Colour (bgcolor)
  • Want to specify an alternate style (& you'd rather not republish the layer on the WMS)
  • Want to render a set of layers into a single tileset (so that clients don't need to download them separately & merge locally)

An example geowebcache.xml is given in the GADM section below.

WFS

If you are displaying a complex dataset at zoomed-out resolutions, then you will want to have simplified views.

e.g. Hospitals aren't shown at all at low zooms, are shown as Points at medium zooms & shown as Polygons at high zooms.

Scale-dependent styling using SLD in GeoServer:

Scale-dependent styling in OpenLayers:

Simplifying Polygons in PostGIS:

Showing the different layers at different zooms using GeoServer:

Add WMS Layer to Sahana Eden

  • tbc

WMS Reprojection

  • Have a remote WMS source that you want to access?
  • Have a desire to keep OpenStreetMap/Google/Bing layers?
  • WMS source server doesn't support the 900913 projection?

e.g. TRMM Rainfall Monitoring

Solution: MapProxy

Grid

We have a 'Coordinate Grid' Layer available by default.

Other options:

Administrative Areas

GADM

GADM is the best source of global Administrative Boundaries:

There are often better local sources for specific countries, although getting hold of these can be difficult. Note that some countries have boundaries which change frequently and so datasets can often be a little out of date.

Import into PostGIS:

apt-get -y install gdal-bin pgloader
mkdir GADMv1
cd GADMv1
wget http://gadm.org/data/gadm_v1_lev0_shp.zip
wget http://gadm.org/data/gadm_v1_lev1_shp.zip
wget http://biogeo.ucdavis.edu/data/gadm/gadm_v1_lev2_shp.zip
unzip gadm_v1_lev0_shp.zip
unzip gadm_v1_lev1_shp.zip
unzip gadm_v1_lev2_shp.zip
ogr2ogr -f CSV CSV gadm1_lev0.shp -lco GEOMETRY=AS_WKT
ogr2ogr -f CSV CSV2 gadm1_lev1.shp -lco GEOMETRY=AS_WKT
ogr2ogr -f CSV CSV3 gadm_v1_lev2.shp -lco GEOMETRY=AS_WKT
mv CSV2/gadm1_lev1.csv CSV
mv CSV3/gadm_v1_lev2.csv CSV
# Fix encodings! http://ww.gadm.org/node/287 (a sketch of this step follows the block)
python utf8.py
rm -rf CSV
rm -rf CSV2
rm -rf CSV3
cat << EOF > "/home/GADMv1/pgloader.conf"
[pgsql]
host    =   localhost
port    =   5432
base    =   sahana
user    =   sahana
pass    =   eden

client_encoding = 'utf-8'

[ogr_tmpl]
template     = True
format       = csv
field_sep    = ,
quotechar    = "
trailing_sep = False

[l1]
use_template    = ogr_tmpl
table           = public.gis_location
filename        = gadm1_lev0_utf8.csv
columns         = wkt:1, name:6, code:22, area:28
only_cols       = 1, 6, 22, 28
skip_head_lines = 1

EOF
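# Raise pgloader's CSV field size limit so that rows with very large WKT polygons are not rejected: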
vim /usr/share/pyshared/pgloader/csvreader.py
csv.field_size_limit(2**20 * 100)  # 100 megs

pgloader
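
The exact utf8.py used for the encoding fix above isn't reproduced here; a minimal sketch of such a re-encoding step, assuming the GADM CSVs are Latin-1 (ISO-8859-1), might look like:

import codecs

# Sketch only: re-encode the GADM CSVs to UTF-8 for pgloader.
# Assumes the source files are Latin-1; adjust the encoding if not.
for name in ("gadm1_lev0", "gadm1_lev1", "gadm_v1_lev2"):
    src = codecs.open("CSV/%s.csv" % name, "r", "iso-8859-1")
    dst = codecs.open("%s_utf8.csv" % name, "w", "utf-8")
    dst.write(src.read())
    src.close()
    dst.close()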

ToDo: Get this imported into Sahana's gis_location table for consistency across basemap & queries (naming & boundaries).

This can then be served as 3 WMS layers using GeoServer & GeoWebCache.

An example geowebcache.xml for GADM is attached, which provides 3 layers:

  • L0 Base
  • L0-L2 Base
  • L0-L2 Overlay

cp geowebcache.xml /var/gis/geoserver_data/gwc
/etc/init.d/tomcat6 restart

Population Density

GPWv3

Gridded Population of the World, version 3 (GPWv3) is the standard global dataset for both measured & projected population densities.

Download gl_gpwfe_pdens_10_wrk_25.zip from (requires registration):

mkdir /home/data/GPWv3
cd /home/data/GPWv3
unzip gl_gpwfe_pdens_10_wrk_25.zip
/usr/local/bin/gdal_translate -of GTiff glfedens10/glds10ag/hdr.adf glds10ag.tif
ln -s /home/data/GPWv3 /var/lib/tomcat6/webapps/geoserver/data/coverages/GPWv3

Add new GeoTIFF Store to GeoServer to serve as WMS:

URL: file:coverages/GPWv3/glds10ag.tif

Style:

GRUMPv1

Global Rural-Urban Mapping Project, version 1 (GRUMPv1) is a newer dataset which combines satellite with census data to locate people within settlements rather than just administrative areas:

Topographic Maps

WMS

Topography can be rendered as WMS using e.g. GeoServer

Download the SRTMs in GeoTIFF format from ftp://xftp.jrc.it/pub/srtmV4/tiff

  • get.sh can assist with this
  • Beware this is a *lot* of data! (At least 60Gb for a global dataset)
    • The GeoTIFFs are already well compressed
    • Partial datasets are, of course, possible (@ToDo: A script to select an area & download just that area)

mkdir /tmp/SRTMv4
cd /tmp/SRTMv4
sh get.sh

Unzip the data into a folder called 'SRTMv4' in the GeoServer 'coverages' folder (or use a symlink):

cd /var/lib/tomcat6/webapps/geoserver/data/coverages
mkdir SRTMv4
unzip -o /tmp/SRTMv4/\*.zip -d SRTMv4
rm /var/lib/tomcat6/webapps/geoserver/data/coverages/SRTMv4/*.hdr
rm /var/lib/tomcat6/webapps/geoserver/data/coverages/SRTMv4/*.tfw
rm /var/lib/tomcat6/webapps/geoserver/data/coverages/SRTMv4/readme.txt

Give Tomcat permission to the folder:

chown tomcat6 /var/lib/tomcat6/webapps/geoserver/data/coverages/SRTMv4

Configure GeoServer by adding a new Store using the Image mosaicking plugin

  • URL: file:coverages/SRTMv4

Publish (defaults OK)

Styling:

Test out using the direct WMS URL:

Once happy then start using the GeoCache URL (but don't do this too early as otherwise you have to invalidate the cache to see your changes):

OpenStreetMap

Contours can be rendered using OSM tools:

http://de.wikipedia.org/wiki/Benutzer:Alexrk2/SRTM-Reliefs

  • suggests using GIMP's emboss filter! Azimuth = 135, height = 50, depth = 10

Old Printed Maps

Old Printed Maps can be 'Rectified' to be overlaid on the base maps:

OpenStreetMap

Base Map

We have the out-of-the-box ability to use OpenStreetMap Tiles as a base layer.

This can include local OSM sites (OSM Taiwan is included as an example)

Vector Overlays

OSM Vectors can be displayed over the top of other Base Layers (e.g. Satellite Images).

Import

We have an XSLT stylesheet to import .osm files

e.g. for hospitals and clinics:

osmosis --read-xml country.osm --tf accept-nodes amenity=hospital,clinic --tf reject-ways --tf reject-relations --write-xml nodes.osm
osmosis --read-xml country.osm --tf reject-relations --tf accept-ways amenity=hospital,clinic --used-node --write-xml ways.osm
osmosis --rx nodes.osm --rx ways.osm --merge --wx country_hospitals.osm
http://myhost.com/eden/hms/hospital/create.osm?filename=country_hospitals.osm

This needs more work to understand the admin hierarchy properly to be able to import Places.

Geofabrik has extracts for Pakistan which are updated daily:

Otherwise pull a BBOX directly using Osmosis:

Osmosis requires Java. There are Python options for filtering based on tag, which would be more suitable for integration within Sahana; however, we would need to add Polygon filtering using Shapely:
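
A minimal sketch of that missing piece - testing whether a node lies inside a boundary polygon with Shapely (the boundary WKT & the coordinates are placeholders):

from shapely import wkt
from shapely.geometry import Point

# Sketch: point-in-polygon filtering with Shapely.
# The boundary WKT & the node coordinates are placeholders.
boundary = wkt.loads("POLYGON((70 29, 75 29, 75 34, 70 34, 70 29))")

def node_in_boundary(lon, lat):
    """Return True if the node lies within the boundary polygon."""
    return boundary.contains(Point(lon, lat))

print node_in_boundary(72.5, 31.0)  # True for this placeholder polygon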

Ruby script to generate KML of recently-added locations by a group of users:

Basemap for Garmin GPS

PostgreSQL management

PostGIS functions

  • Centroids
    SELECT name, iso2, asText(ST_Transform(ST_Centroid(the_geom), 4326)) AS centroid FROM countries;
    

Data Sources

L0

L1

L2

OGC (WMS/WFS)


GIS
