Quality Assurance
There are a number of initiatives to assure the quality of Sahana Eden, however if you find any issues, please report them following the Bug Reporting Guidelines.
Testing
"A bug is a test case you haven't written yet"
"Unit Tests allow merciless refactoring"
This page describes our current approach; see our BluePrint for future options.
Test-Driven Development is a programming style in which you first write your test cases (from the specs) and then proceed to make them pass.
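As an illustration (not Sahana Eden code), the red-green cycle for a hypothetical slugify helper might look like this: the test is written first from the spec, fails, and then just enough code is written to make it pass:

```python
import re
import unittest

# Step 2: the implementation, written only after the test below existed
# (hypothetical helper, for illustration only)
def slugify(title):
    """Lower-case a title and replace runs of non-alphanumerics with '-'."""
    return re.sub(r"[^a-z0-9]+", "-", title.lower()).strip("-")

# Step 1: the test, written first from the spec
class SlugifyTests(unittest.TestCase):
    def test_spaces_become_hyphens(self):
        self.assertEqual(slugify("Quality Assurance"), "quality-assurance")

    def test_punctuation_is_stripped(self):
        self.assertEqual(slugify("Sahana Eden: QA!"), "sahana-eden-qa")

# Run with: python -m unittest <this module>
```
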
Automated Tests
Smoke Tests
Smoke Tests click through every link within Sahana Eden and can be used to check for errors on pages and broken links. They are a lightweight approach to detecting basic errors; however, they do not test form submission or any other interaction. To run the Smoke Tests:
python web2py.py -S eden -M -R applications/eden/modules/tests/suite.py -A --suite smoke --force-debug --link-depth 16 -V 3
TODO: Add details on the dependencies for these tests.
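The core idea behind a smoke test - crawl pages, collect their links, and flag broken ones - can be sketched with the standard library alone (this is an illustration, not the suite's actual implementation):

```python
from html.parser import HTMLParser

class LinkCollector(HTMLParser):
    """Collect href targets from anchor tags, as a smoke test crawler would."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

# A real smoke test would fetch each page over HTTP, collect its links,
# queue any unvisited ones up to the configured link depth, and flag
# non-200 responses or error tickets.
page = ('<html><body><a href="/eden/default/index">Home</a>'
        '<a href="/eden/org/organisation">Organisations</a></body></html>')
collector = LinkCollector()
collector.feed(page)
print(collector.links)  # → ['/eden/default/index', '/eden/org/organisation']
```
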
Selenium Tests
Selenium Tests use the Selenium WebDriver to simulate user interactions within a browser. They are very thorough, as they test interactions across the entire Sahana Eden stack including the JavaScript, but they can also be fragile (report failures that are not real defects). To run all the Selenium Tests:
python web2py.py -S eden -M -R applications/eden/modules/tests/suite.py
See: Automated Tests - Selenium for more details
Unit Tests
Unit Tests can be used to check whether specific "units" of code are working. They are used extensively to test the Sahana Eden "S3" Framework. Unit Tests require running with the IFRC_Train prepopulate, 'settings.base.prepopulate = 27'. To run all Unit Tests:
python web2py.py -S eden -M -R applications/eden/modules/unit_tests/suite.py
These unit tests are meant to detect problems early during development, which means you should run them quite often while you're still working on the code (read: before and after every little change you make) rather than expecting them to be run during QA cycles. That again means you would more often need to run tests for particular modules than the whole suite.
For every module in modules/s3 and modules/s3db, you can find the corresponding unit test module under modules/unit_tests/s3 and modules/unit_tests/s3db respectively (if one exists).
To run tests for particular modules:
# e.g. for modules in modules/s3
python web2py.py -S eden -M -R applications/eden/modules/unit_tests/s3/s3resource.py
# e.g. for modules in modules/s3db
python web2py.py -S eden -M -R applications/eden/modules/unit_tests/s3db/pr.py
It can be a very powerful development strategy - especially for back-end APIs - to first implement unit test cases for the functionality you intend to implement before actually implementing it. Apart from preventing bugs, this helps you to validate your design against requirements, and to keep the implementation simple and focused. Additionally, the test cases can be a rich source of code samples showing how to apply your API methods.
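For instance, a test case written first against a hypothetical query-string parser (not a real Sahana Eden API) both pins down the spec and documents how callers would use the method:

```python
import unittest

def parse_filters(query):
    """Parse 'field=value' pairs from a URL query string into a dict
    (hypothetical API, used here to illustrate test-first development)."""
    filters = {}
    for pair in query.split("&"):
        if "=" in pair:
            key, _, value = pair.partition("=")
            filters[key] = value
    return filters

class ParseFiltersTests(unittest.TestCase):
    # Each test doubles as a usage example for the API
    def test_single_filter(self):
        self.assertEqual(parse_filters("name=Eden"), {"name": "Eden"})

    def test_multiple_filters(self):
        self.assertEqual(parse_filters("name=Eden&type=NGO"),
                         {"name": "Eden", "type": "NGO"})

    def test_malformed_pairs_are_ignored(self):
        self.assertEqual(parse_filters("noequalsign"), {})

# Run with: python -m unittest <this module>
```
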
Role Tests
Role tests are used to check the permissions of user roles. Currently they are limited to the IFRC (Red Cross) roles for RMS, but they could be extended. To run the Role Tests:
python web2py.py -S eden -M -R applications/eden/modules/tests/suite.py -A --suite roles
Benchmark Tests
The Benchmark Tests are a simple way of measuring the performance of the Sahana Eden "S3" Framework. The result of these tests will give you a measure of the performance relative to the system running them. To run the Benchmark Tests:
python web2py.py -S eden -M -R applications/eden/modules/unit_tests/s3/benchmark.py
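In the same spirit, a micro-benchmark for your own code can be written with the standard library's timeit module; as noted above, the absolute numbers are only meaningful relative to the machine running them (the workload below is a hypothetical example):

```python
import timeit

# Hypothetical workload: building a lookup dict from rows, timed two ways
setup = "rows = [(i, 'name%d' % i) for i in range(1000)]"

loop_time = timeit.timeit(
    "d = {}\nfor key, name in rows:\n    d[key] = name",
    setup=setup, number=1000)
comp_time = timeit.timeit(
    "d = {key: name for key, name in rows}",
    setup=setup, number=1000)

# Report seconds per 1000 runs; values vary with the host system
print("loop: %.4fs  comprehension: %.4fs" % (loop_time, comp_time))
```
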
Load Tests
Load Testing measures the performance of the entire Sahana Eden stack. We recommend using Tsung.
Continuous Integration (CI) Server
The CI Server continuously runs all Automated Tests against the latest version of Sahana Eden to detect any defects. See: SysAdmin/ContinuousIntegration for how we have set up the CI Server and for the exact commands used to run the tests.
Test Cases
See: TestCases for more details - this page needs updating
- Sahana Eden Test Cases Spreadsheet
- Test Cases from Historic Deployments
See Also
Systers' approach:
- http://systers.org/systers-dev/doku.php/automated_functional_testing
- List of Tests: http://systers.org/systers-dev/doku.php/master_checklist_template
- GSoC project: http://systers.org/systers-dev/doku.php/svaksha:patches_release_testing_automation
Alternative Options:
- http://zesty.ca/scrape/
- Lightning Talk (2.30)