Pakistan Deployment Cycle
Process
We do daily upgrades of the systems from a single Fabric script:
- Upgrade Live with code release from UAT
- Refresh the UAT database with data from Live
- Upgrade UAT to Trunk code
- Send a notification to the List with a summary of the changes on both Test & Live
Instructions
Login to eden.sahanafoundation.org
1st time:
cd /home/release
fab generate_keys
fab test distribute_keys
fab prod distribute_keys
Subsequently:
cd /home/release
fab prod deploy
fab test deploy
Make a note of any upgrade issues with the migration on Test so that they can be streamlined in tomorrow's migration on Live
Sysadmin ToDo
- Current Fabfile: http://gist.github.com/567231
- Current Aliases: ConfigurationGuidelines#Usefulaliases
1. Upgrade Live with code release from UAT
- Read VERSION from UAT to know which revision to pull to live
- maintenance_on()
- migrate_on()
- pull()
- the script should default to a plain 'bzr pull', but accept an optional 'update XXX' argument to run 'bzr pull -r XXX' instead
- Extend the cleanup() line to cover all files (in Models at least)
- either parse the bzr output or search filesystem - whichever is easier/quicker
- migrate
- How do we pass input into the PTY?
- check for migration failures in databases/sql.log
- resolve any migration failures
- we should be able to have a script developed during the UAT upgrade to do this automatically
- migrate_off()
- maintenance_off()
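The pull step in this sequence could be sketched as a small helper that defaults to a plain 'bzr pull' and pins a revision only when one is read from UAT's VERSION file. The function names, argument handling, and application path below are assumptions for illustration, not the current fabfile:

```python
import subprocess

def pull_command(revision=None):
    # Default to a plain 'bzr pull'; pin to a specific revision
    # (e.g. the one read from UAT's VERSION) when one is supplied.
    cmd = ["bzr", "pull"]
    if revision is not None:
        cmd += ["-r", str(revision)]
    return cmd

def pull(revision=None):
    # Run the command in the application directory (path assumed).
    subprocess.check_call(pull_command(revision),
                          cwd="/home/web2py/applications/eden")
```

The same helper serves both the daily upgrade (no argument) and a pinned release (`pull(2891)`).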
2. Refresh the UAT database with data from Live
- Include 'uploads' folder
- Need to ensure that User Accounts in Test are not overwritten
- Need to ensure that Role memberships in Live & Test can be different
- Maybe add generic role accounts in a script after the DB replaced?
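One way to sketch the refresh while keeping Test's accounts intact is to skip the auth tables in the dump. The database name, table names, and paths below are assumptions about the schema, not confirmed:

```python
def refresh_commands(db="sahana", skip_tables=("auth_user", "auth_membership")):
    # Dump Live minus the auth tables so Test's User Accounts and
    # Role memberships survive, then load the dump on Test.
    ignore = " ".join("--ignore-table=%s.%s" % (db, t) for t in skip_tables)
    return [
        "mysqldump %s %s > /tmp/live.sql" % (ignore, db),
        "mysql %s < /tmp/live.sql" % db,
        # Copy the uploads folder across as well (paths assumed).
        "rsync -a /home/web2py/applications/eden/uploads/ "
        "test:/home/web2py/applications/eden/uploads/",
    ]
```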
3. Upgrade UAT to Trunk code
- pull
- check for conflicts & copy all .THIS over (either parse the bzr output or search filesystem - whichever is easier/quicker)
- migrate (CLI web2py load as 'su web2py')
- check for migration failures in databases/sql.log
- resolve any migration failures
- let user know which table failed (in sql.log)
- launch a mysql prompt with 'show innodb status;' (parsed?)
- potentially even have mysql fix it automatically (possible for sure, but lower priority than the core)
- migrate_off()
4. Send a notification to the List with a summary of the changes on both Test & Live
Notifications can be built with info from the Trac Timeline
- Investigate a custom Trac script to build the report automatically, e.g. building on these:
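A minimal sketch of such a script, assuming the notification is seeded from Trac's timeline RSS feed (/timeline?format=rss); the feed URL and item structure are assumptions about this Trac install:

```python
import xml.etree.ElementTree as ET

def summarize_timeline(rss_text, limit=10):
    # Collect item titles from the timeline feed; these become the
    # one-line change summary in the notification mail.
    root = ET.fromstring(rss_text)
    titles = [item.findtext("title", "") for item in root.iter("item")]
    return titles[:limit]
```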
General
- Set deployment_settings on UAT to the same as Prod
- Add rollback() by reading VERSION before 'bzr pull', so we can then 'bzr revert -r $version'
- Add update() for Debian packages: SSH into each server & 'apt-get update; apt-get upgrade'
- Enhance Apache Maintenance site
  - allowing access to the site through a browser, but using a different name (which we don't publish):
    /etc/apache2/sites-available/maintenance
  - improving the text on the maintenance page:
    /var/www/maintenance.html
- dev.pakistan.sahanafoundation.org instance needs adding to the Fabfile
- This shouldn't be fully-automated into the upgrades cycle, but does have a script to refresh data from live manually
- dev. will run PostgreSQL!
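The rollback() idea above might be sketched as reading the revision out of VERSION before pulling, then building the revert command from it. The VERSION file format (first token is the revision number) is an assumption:

```python
def rollback_command(version_text):
    # VERSION was read before 'bzr pull'; use its recorded revision
    # to build the revert command.
    revision = version_text.strip().split()[0]
    return "bzr revert -r %s" % revision
```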
Live
- Schedule the Ushahidi imports:
- http://pakistan.sahanafoundation.org/eden/irs/ireport/ushahidi
- http://pakreport.org/ushahidi/api?task=incidents&by=all&resp=xml&limit=1000 (how do we avoid this?)
- Can we pass URL as argument?
- Upgrade Geraldo to 0.4.0 (currently 0.3.9)
- Upgrade ReportLab from the debian-packaged 'python-reportlab 2.1dfsg-2' to current 2.4
- Get MapProxy working (basic install on 'geo' done)
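Scheduling the Ushahidi imports above could be as simple as a cron entry fetching the controller URL. The schedule, the use of wget, and any URL argument for the source feed are assumptions (the controller may not accept one yet):

```
# Hypothetical crontab entry: pull the Ushahidi feed into Eden nightly.
0 2 * * * wget -q -O /dev/null "http://pakistan.sahanafoundation.org/eden/irs/ireport/ushahidi"
```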
Demo
- Update Demo (whilst keeping the logins there intact - all other data can be dropped)
Trac
- Investigate a fix or alternative to http://trac-hacks.org/wiki/MathCaptchaPlugin for allowing Trac users to register bugs anonymously whilst not locking out our testing team.
- Convert from sqlite to PostgreSQL (or MySQL) to improve performance