| 1 | 1. Personal Details |
| 2 | |
| 3 | Name: Ashwyn Sharma |
| 4 | |
| 5 | Email: ashwyn1092@gmail.com |
| 6 | |
| 7 | Freenode IRC Nickname: ashwyn |
| 8 | |
| 9 | Skype: ashwyn sharma |
| 10 | |
Age: 19
| 12 | |
Education: Currently pursuing a B.E. (Bachelor of Engineering) at N.S.I.T., New Delhi.
| 14 | |
Country: India
| 16 | |
Timezone: GMT+05:30
| 18 | |
LinkedIn Profile: http://in.linkedin.com/pub/ashwyn-sharma/37/bb2/777
| 20 | |
Exposure To Similar Technologies and/or FOSS in general:
| 22 | |
My hands-on experience with FOSS was fairly limited when I started contributing to Sahana, but I have spent the past few years developing a good understanding of how FOSS projects operate, which has given me sufficient exposure to the whole concept of Free and Open Source Software (FOSS). Moreover, my involvement with the Sahana Software Foundation over the last two months has given me considerable experience with Python, and with web2py in particular.
| 24 | |
| 25 | Why would you like to help the Sahana project?: |
| 26 | |
The Sahana Software Foundation enables organisations and communities to better prepare for and respond to disasters, and in the process its information management solutions help save lives. Contributing to a cause as noble as this adds an extra sense of purpose to all the coding and development that goes on during the summer. Living in a country like India, which is prone to several natural hazards and disasters, and having witnessed one myself (the 2001 Gujarat earthquake: http://en.wikipedia.org/wiki/2001_Gujarat_earthquake), I understand the need for a deployment tool like Sahana Eden in a very tangible way. Moreover, I believe that my work with the Sahana community so far will help me contribute to Sahana, and to the Eden project in particular, in the coming months.
| 28 | |
| 29 | |
| 30 | |
| 31 | |
| 32 | |
| 33 | |
| 34 | |
| 35 | 2. Personal Availability |
| 36 | |
| 37 | Have you reviewed the timeline for GSoC 2012? |
| 38 | |
| 39 | Yes, I have reviewed the timeline for GSoC 2012. |
| 40 | |
| 41 | Do you have any significant conflicts with the listed schedule? If so, please list them here. |
| 42 | |
No, not really. My semester exams fall in the second half of May 2012, and since coding only starts on 21st May according to the timeline, I will not be losing any significant time. Moreover, I would be more than happy to start coding before that period to compensate for the few days lost.
| 44 | |
| 45 | Will you need to finish your project prior to the end of the GSOC? |
| 46 | |
No, my project is planned to be developed over the whole summer.
| 48 | |
| 49 | Are there any significant periods during the summer that you will not be available? |
| 50 | |
Apart from the conflict listed above, I will be completely available throughout the summer.
| 52 | |
| 53 | |
| 54 | |
| 55 | |
| 56 | |
| 57 | |
| 58 | |
| 59 | |
| 60 | 3. Project Abstract |
| 61 | |
| 62 | |
The essential requirement of this project is to parse inbound messages, with an initial focus on SMS. The project is specifically aimed at the CERT use case, where responses to deployment notifications (in other words, replies to deployment requests) need to be processed. Currently, message parsing is done in the core code, specifically in the parse_message() method in modules/s3/s3msg.py. The parsing rules will instead be defined in private/prepopulate, which allows multiple profile options to be hosted in the main code; s3parsing.py can then import these parsing rules from prepopulate. This also reinforces the ongoing work on the Profile Layer, in which deployment-specific files are separated from the core code. The parsing module uses a data model, "msg_workflow", to link the source and the workflow and to schedule tasks. Processing of OpenGeoSMS-encoded messages is another important area of work, especially for the existing Android client, for which it will be of real use. Finally, to make the existing code more robust and extensible, the pyparsing parser module, or another parser generator, can be incorporated, depending on the parsing needs.
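To illustrate the pyparsing option mentioned above, a grammar for a hypothetical deployment-response SMS might look like the sketch below; the message format, keyword names and returned fields are illustrative assumptions, not part of the existing Eden code:

# A minimal sketch, assuming pyparsing is installed, for a hypothetical CERT
# deployment-response SMS such as "YES ID:42 ETA:30min".
from pyparsing import (CaselessKeyword, Optional, ParseException, Suppress,
                       Word, alphanums, nums)

answer = (CaselessKeyword("YES") | CaselessKeyword("NO"))("answer")
deployment_id = Suppress(CaselessKeyword("ID") + ":") + Word(nums)("deployment_id")
eta = Optional(Suppress(CaselessKeyword("ETA") + ":") + Word(alphanums)("eta"))

reply_grammar = answer + deployment_id + eta

def parse_reply(text):
    """Return the parsed fields of a deployment reply, or None if it doesn't match."""
    try:
        result = reply_grammar.parseString(text)
    except ParseException:
        return None
    return {"answer": result.answer,
            "deployment_id": result.deployment_id,
            "eta": result.eta or None}

# parse_reply("YES ID:42 ETA:30min")
# -> {'answer': 'YES', 'deployment_id': '42', 'eta': '30min'}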
| 64 | |
| 65 | |
| 66 | |
| 67 | |
| 68 | |
| 69 | 4. Project Plan |
| 70 | |
| 71 | Project Deliverable: |
| 72 | |
The project aims to parse inbound messages, such as SMS from CERT responders after a deployment.
It enables the processing of responses to deployment notifications, which is essentially controlled by the module to which the message is routed.
| 75 | |
| 76 | Project Justification: |
| 77 | |
Parsing of inbound messages is a critical utility for a trained volunteer group such as CERT (Community Emergency Response Teams), where communication between deployments and volunteers plays a vital role. As this will be a deployment-specific option, the functionality becomes an important component of Sahana Eden.
| 79 | |
| 80 | Implementation Plan: |
Keeping the development of the Profile Layer in mind, and since this functionality is one of the deployment-specific options, the rules for parsing are contained in private/prepopulate, from where s3parsing.py imports them.
The module contains a class S3ParsingModel, which defines the "msg_workflow" data model (see https://docs.google.com/document/d/1Y9dDCshurrZSw33r-RC_uVQ_Va6_LEZM-2aLcaT2Krc/edit?pli=1), and another class S3Parsing, in which the parsing routines that drive the various parsing workflows are defined.
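A rough skeleton of how s3parsing.py could be laid out is shown below; the class names follow this proposal, while the routine names and base class are placeholders chosen for illustration rather than actual Eden code:

# Sketch only: the intended structure of s3parsing.py.

class S3ParsingModel(object):
    """Defines the msg_workflow table, which links a message source to a
    parsing workflow (see the data-model sketch further below)."""

    def __init__(self, db):
        self.db = db
        # the msg_workflow table definition goes here

class S3Parsing(object):
    """Container for the parsing routines, one per workflow."""

    @staticmethod
    def parse_deployment_reply(message):
        """CERT use case: interpret a reply to a deployment request."""
        raise NotImplementedError

    @staticmethod
    def parse_opengeosms(message):
        """Extract a location from an OpenGeoSMS-encoded message."""
        raise NotImplementedError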
The current parsing rules implement the functionality in the following manner (a simplified sketch follows this list):
The inbound message text is passed as an argument to the parse_message() method in the s3msg.py module.
The text is split on whitespace, and the tokens are matched against predefined lists of primary and contact keywords.
A database query is generated against the relevant table according to the matched keywords.
The query retrieves the relevant field values, and a reply to the inbound message is generated.
So far these parsing rules have been implemented only for the 'Person', 'Hospital' and 'Organisation' modules.
Extending these rules to other modules can be within the scope of the project.
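A simplified sketch of this keyword-matching flow is given below; the keyword lists, the table and its fields are placeholders rather than the exact ones used in s3msg.py:

# Simplified sketch of the keyword-based parsing, assuming a web2py-style `db`;
# the hospital table and its fields here are hypothetical stand-ins.

PRIMARY_KEYWORDS = ("person", "hospital", "organisation")
CONTACT_KEYWORDS = ("email", "phone", "address")

def parse_message(db, text):
    """Match the message against known keywords and build a reply string."""
    words = text.lower().split()            # split on whitespace
    primary = [w for w in words if w in PRIMARY_KEYWORDS]
    contacts = [w for w in words if w in CONTACT_KEYWORDS]
    if not primary:
        return None                         # nothing we recognise

    if primary[0] == "hospital":
        # the remaining tokens act as the search term, e.g. the hospital name
        term = " ".join(w for w in words if w not in primary + contacts)
        table = db.hms_hospital
        row = db(table.name.contains(term)).select(table.name,
                                                   table.phone,
                                                   limitby=(0, 1)).first()
        if row:
            return "%s, Phone: %s" % (row.name, row.phone)
    # ... analogous branches for 'person' and 'organisation'
    return None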
| 90 | |
One of the main issues will be identifying which source a message belongs to, so that it can have its own processing. This is handled by the data model, which defines a 'msg_workflow' table in the database linking the Source to the Workflow with any required arguments. The essential features of this approach are listed below:
The parser workflow table links 'SMS Source X' to 'Workflow Y'.
Designing the details of Workflow Y will be a developer task.
Linking 'SMS Source X' to 'Workflow Y', on the other hand, will be a configurable option.
So essentially, the parser table links a Source to a Workflow with any other required arguments, and acts as a template for the scheduler_task table (a table sketch is given below).
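A minimal sketch of this table in web2py syntax follows; the field names come from the description here and in the gdoc, while the field types and the comments column are assumptions:

# Sketch only: the msg_workflow table linking an inbound source to a parsing
# workflow (assumes the usual web2py model environment, i.e. `db` and `Field`).
db.define_table("msg_workflow",
                # identifies the inbound source, e.g. a specific email account
                # or SMS modem channel
                Field("source_task_id", "string"),
                # identifies the parsing routine in S3Parsing to be invoked
                Field("workflow_task_id", "string"),
                # any extra arguments to pass through to the scheduler task
                Field("comments", "text"))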
A task process_log() is defined in tasks.py; its objective is to scan through all the messages in msg_log and process for parsing those which are flagged as unparsed (is_parsed=False). The task is scheduled in zzz_1st_run.py, where it is chained to the relevant parsing task (this is achieved via the msg_workflow table: the 'source_task_id' field in msg_log is used to retrieve the corresponding parsing workflow_task_id from msg_workflow). A sketch of process_log() follows this block.
This also allows chaining of workflows, where the source for a workflow can be another workflow instead of an incoming source. We can have 2nd-pass parser workflows which don't start from the source directly but are plugged in as the output of a 1st-pass one:
Source -> process_log() -> 1st-pass parser -> detailed parser -> Module
Here, the 1st-pass parser is customised per deployment; it decides which email source goes to a particular workflow (a simple msg_workflow link), or decides, based on other factors such as keywords, which main workflow the messages should be passed to.
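A hedged sketch of process_log() under the assumptions above (msg_log carries is_parsed, source_task_id and a reply column; parse_import() is a hypothetical dispatcher that runs the S3Parsing routine registered for a workflow):

# Sketch of the scheduled task, assuming the web2py model environment with the
# msg_log and msg_workflow tables described above.

def process_log():
    ltable = db.msg_log
    wtable = db.msg_workflow
    unparsed = db(ltable.is_parsed == False).select(ltable.id,
                                                    ltable.message,
                                                    ltable.source_task_id)
    for msg in unparsed:
        # look up the workflow configured for this message's source
        workflow = db(wtable.source_task_id == msg.source_task_id) \
                     .select(wtable.workflow_task_id, limitby=(0, 1)).first()
        if not workflow:
            continue                        # no workflow linked to this source
        reply = parse_import(workflow.workflow_task_id, msg.message)
        db(ltable.id == msg.id).update(is_parsed=True, reply=reply)
    db.commit()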
The data model is integrated with the prepopulate folders (or a sub-folder, say private/prepopulate/parsing), which serves as the initial UI. The post-install UI will consist of a CRUD-interface admin panel, a simple s3_rest_controller(); eventually, however, this is planned to become part of the WebSetup.
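The admin panel itself would then be little more than a standard REST controller over the msg_workflow table, along these lines (a minimal sketch, assuming the table keeps the "msg" module prefix):

# Sketch: in controllers/msg.py, expose CRUD on the msg_workflow table via the
# standard REST controller.
def workflow():
    """RESTful CRUD for source-to-workflow links."""
    return s3_rest_controller("msg", "workflow")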
We want to be able to direct the message to the appropriate module to handle the data. This could be done either by launching a real REST request or by simulating one via the API:
| 102 | resource = s3mgr.define_resource("module", "resourcename") |
Messages which are routed to a specific resource can be subscribed to by the user. For this purpose, we can use the existing Save Search and Subscription functionality, where the user subscribes to new messages for a specific resource using a resource filter. msg_log can be made a component of these resources; as a component, the messages will then appear in a tab when someone opens the resource. If a message has to be tied to multiple resources, we can use a relationship (link) table.
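For the multiple-resources case, a minimal sketch of such a link table is shown below, using incident reports as an example; the table and field names are illustrative rather than existing Eden definitions:

# Sketch only: a link table tying a logged message to an incident report, so
# that msg_log can appear as a component tab on the report.
db.define_table("irs_ireport_msg",
                Field("ireport_id", db.irs_ireport),   # the incident report
                Field("message_id", db.msg_log))       # the inbound message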
Implementing/extending the utility for other modules, especially the IRS module, will be of real use: enabling incident reports to be logged through SMS will be vital, and this can also use the OpenGeoSMS encoding standard (LatLon generates a Google Maps URL) for integration with our Android client. A dedicated routine to generate OpenGeoSMS URLs already exists, prepare_opengeosms() in s3msg.py itself, so integration with the parsing routine won't be difficult. Other modules for which this can be implemented are 'Request' and 'Inventory'.
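For the inbound direction, decoding an OpenGeoSMS-style message is essentially a matter of extracting the coordinates from the embedded maps URL; a rough sketch is given below, where the URL shape and the return format are assumptions:

# Sketch of decoding an inbound OpenGeoSMS-style message: the encoding embeds a
# maps URL such as "http://maps.google.com/?q=<lat>,<lon>" followed by free text.
import re

OPENGEOSMS_LATLON = re.compile(r"q=(?P<lat>[-+]?\d+(\.\d+)?),(?P<lon>[-+]?\d+(\.\d+)?)")

def decode_opengeosms(body):
    """Return (lat, lon, remaining_text), or None if no location is embedded."""
    match = OPENGEOSMS_LATLON.search(body)
    if not match:
        return None
    lat = float(match.group("lat"))
    lon = float(match.group("lon"))
    text = body[match.end():].strip()
    return lat, lon, text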
Finally, the code will be tested on the system and the bugs (if any ;-) ) will be fixed.
| 106 | |
| 107 | |
| 108 | |
| 109 | Future Options: |
Though the parsing rules will be generic, a few minor tweaks will have to be made for other processes, such as Email and Twitter, to maintain that generic nature.
One of the most valuable pieces of functionality that can be added here is making the SMS communication more interactive: e.g. if the received text body does not match any of the expected keywords, the API dispatches a reply stating the expected format of the message (sketched below this list).
Adapting the parsing rules to cover as wide a base of inbound messages as possible. This will involve compiling a wider collection of keywords to be searched for each concerned module; linking different labels across the database to module-specific keywords will be really helpful. The list of primary keywords to be matched can also be made a deployment-specific option.
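A small sketch of the interactive fallback mentioned above; send_sms here stands in for the existing outbound-messaging API rather than an actual Eden function name, and the hint text is illustrative:

# Sketch of the interactive fallback: if no keyword matches, reply with a usage
# hint instead of silently dropping the message.
EXPECTED_FORMAT = "Sorry, message not understood. Expected: <KEYWORD> <search term>, e.g. HOSPITAL Bhuj"

def parse_or_help(text, keywords, parse, send_sms, sender):
    words = text.lower().split()
    if not any(w in keywords for w in words):
        send_sms(sender, EXPECTED_FORMAT)   # tell the sender what we expect
        return None
    return parse(text)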
| 113 | |
| 114 | |
| 115 | Relevant Experience: |
| 116 | |
I have developed a thorough understanding of the existing parsing routine in the application. I am also comfortable using various parser generators. I have discussed many of the ideas in this proposal with the mentors and the rest of the community.
My experience with the Sahana community has been very enjoyable so far, Sahanathon being one of the highlights, where I got the opportunity to demonstrate my ability to work with the code and contribute to Sahana Eden. My notable contributions so far are listed below:
I solved bug #1132 in Trac (http://eden.sahanafoundation.org/ticket/1132), which was merged during the Sahanathon itself. Pull request: https://github.com/flavour/eden/pull/31
Reported and fixed a defect with the update ("Open") button in the saved-searches table.
Made milestones in the project task workflow a deployment-specific option (see https://github.com/flavour/eden/pull/35).
Fixed email_settings() in the msg controller and made the required changes in the menu (see https://github.com/flavour/eden/pull/42).
| 123 | |
| 124 | |
| 125 | |
| 126 | |
| 127 | |
| 128 | 5. Project Goals and Timeline |
| 129 | |
| 130 | |
| 131 | |
| 132 | Work Already Undertaken: |
| 133 | |
Currently, parsing is implemented by the parse_message() method in the s3msg.py module, though its usage is limited, or rather unimplemented, as of now.
Also, the current method is hard-coded and inefficient at handling different processes.
A dedicated data model has been developed with the consent of the mentors; the msg_workflow table has been described exhaustively in the implementation details above and in the linked gdoc.
A mechanism to route messages to resources has also been designed and discussed.
| 138 | |
First Trimester:
| 140 | |
| 141 | |
Due Date - SMART Goal - Measure
(24th April - 7th May) -
Development of the workflow-handling module s3parsing.py starts.
-
1. Community bonding period: I have been involved with the community for some time now, so this won't take a lot of time :)
2. Decision on the template outline.
| 149 | |
(8th May - 21st May) -
The msg_workflow data model is developed.
S3ParsingModel starts to take shape.
- Code committed locally.
| 154 | |
| 155 | |
| 156 | |
| 157 | |
| 158 | |
| 159 | |
| 160 | Second Trimester: |
| 161 | |
Due Date - SMART Goal - Measure
28th May - *(won't be available due to university exams)
| 164 | |
4th June -
CERT: Deployment Request SMS Handler development starts (a critical requirement of the project).
| 167 | Parsing workflow is developed. |
| 168 | - SMS response processing starts to take shape. |
| 169 | |
11th June -
CERT: Deployment Request SMS Handler development continues (a critical requirement of the project).
-
1. Code committed locally.
2. Tested on the local system.
| 175 | |
17th June -
The process_log() method is designed.
Tweaks to msg_log are implemented.
- Code committed to trunk.
| 180 | |
| 181 | 2nd July - |
| 182 | Parsing workflow is chained to process_log(). |
| 183 | Sources are linked to respective workflows. |
- Code committed to trunk.
| 185 | |
| 186 | 8th July - |
Clickatell functionality is developed for Eden.
| 188 | Clickatell allows for a more robust testing mechanism. |
| 189 | Commits are altered with mentor feedback and suggested changes. |
| 190 | |
| 191 | - |
1. Commits so far tested thoroughly.
2. Bug fixing.
3. Code committed to trunk.
| 195 | |
| 196 | [Mid-Term evaluations :-) ] |
| 197 | |
| 198 | |
| 199 | |
| 200 | |
| 201 | |
| 202 | Third Trimester: |
| 203 | |
Due Date - SMART Goal - Measure
| 205 | |
15th July -
| 207 | Integration with prepopulate folders. |
| 208 | Development of post-install UI (CRUD interface admin panel). |
| 209 | - Code committed locally. |
| 210 | |
22nd July -
| 212 | Routing mechanism to resources. |
| 213 | Save search and subscription implemented for msg_log. |
- Code committed to trunk.
| 215 | |
29th July -
The OpenGeoSMS routine (process_opengeosms()) is tweaked and linked with the respective parsing methods.
The existing functionality in the Android Client is tested.
- Code committed to trunk.
| 220 | |
5th August -
Integration of the IRS module for incident reporting through SMS.
The functionality is extended to other modules (if needed).
Feedback from mentors.
- Code committed locally.
| 226 | |
13th August -
System testing & bug fixing.
Final changes to the code are applied.
-
1. Project reaches its final stage.
2. Bug fixes.
3. Final code committed to trunk.
| 234 | |
20th August - PENCILS DOWN! - Project completed. :-)