Changes between Version 5 and Version 6 of Event/2012/GSoC/MessageParsing


Timestamp:
06/06/12 16:09:53
Author:
Fran Boon

=== 1. Personal Details ===

Name: Ashwyn Sharma

Email: ashwyn1092@gmail.com

Freenode IRC Nickname: ashwyn

Skype: ashwyn sharma

Age: 19

Education: Currently pursuing a B.E. (Bachelor of Engineering) at N.S.I.T., New Delhi.

Country: India

Timezone: GMT +0530

LinkedIn Profile: http://in.linkedin.com/pub/ashwyn-sharma/37/bb2/777

Exposure to similar technologies and/or FOSS in general:

My work experience with FOSS was fairly limited when I started contributing to Sahana, but I have spent the past few years developing a good understanding of its modus operandi, which has given me sufficient exposure to the whole concept of Free and Open Source Software (FOSS). Moreover, my involvement with the Sahana Software Foundation over the last two months has given me tremendous experience with Python, and with web2py in particular.

Why would you like to help the Sahana project?:

The Sahana Software Foundation enables organisations and communities to better prepare for and respond to disasters, and in the process its information management solutions help save lives. Contributing to a cause as noble as this adds extra motivation, or rather a purpose, to all the coding and development that goes on during the summer. Living in a country like India, which is prone to several natural hazards and disasters, and having witnessed one myself (the 2001 Gujarat earthquake, http://en.wikipedia.org/wiki/2001_Gujarat_earthquake), I understand the need for a deployment tool like Sahana Eden in a very real way. Moreover, I believe that my work with the Sahana community so far will help me contribute to Sahana, and to the Eden project in particular, in the coming future.

=== 2. Personal Availability ===

Have you reviewed the timeline for GSoC 2012?

Yes, I have reviewed the timeline for GSoC 2012.

Do you have any significant conflicts with the listed schedule? If so, please list them here.

No, not really. My semester exams are in the second half of May 2012, while coding starts on 21st May according to the timeline, so I will not lose any significant time. Moreover, I would be more than happy to start coding before then to compensate for the few lost days.

Will you need to finish your project prior to the end of GSoC?

No, my project is planned to span the whole summer.

Are there any significant periods during the summer that you will not be available?

Apart from the conflict listed above, I will be completely available throughout the summer.

=== 3. Project Abstract ===

The essential requirement for this project is to parse inbound messages, with an initial focus on SMS. The project is specifically aimed at the CERT use case, where they wish to process responses to deployment notifications; in other words, to handle replies to deployment requests. Currently the message parsing is done in the core code, i.e. modules/s3/s3msg.py, specifically in the parse_message() method. The parsing rules will be defined in private/templates, which allows for hosting multiple profile options in the main code. s3parsing.py can then import these parsing rules from templates. This also reinforces the on-going work on the Profile Layer, in which deployment-specific files are separated from the core code. The parsing module utilises a data model, "msg_workflow", to link the source and the workflow to schedule tasks. Processing of OpenGeoSMS-encoded messages is also an important area to work on, especially for the existing Android client, for which it will be of real use. To provide robustness and extend the existing code, the pyparsing module (or another parser generator, subject to the parsing needs) can be incorporated.

=== 4. Project Plan ===

Project Deliverable:

The project aims at parsing inbound messages, such as SMS from CERT responders after deployment.
It enables the processing of responses to deployment notifications, which is essentially controlled by the module to which the message is routed.

Project Justification:

Parsing of inbound messages is a critical utility for a trained volunteer group such as CERT (Community Emergency Response Teams), where communication between the various deployments and volunteers plays a vital role. As this will be a deployment-specific option, the functionality becomes an important component of Sahana Eden.

Implementation Plan:

* Keeping the development of the Profile Layer in mind, and the functionality being part of the deployment-specific options, the rules for parsing are contained in private/templates, from where s3parsing.py imports them.

* The module contains a class S3ParsingModel, which holds the "msg_workflow" data model (see https://docs.google.com/document/d/1Y9dDCshurrZSw33r-RC_uVQ_Va6_LEZM-2aLcaT2Krc/edit?pli=1), and a class S3Parsing, in which the parsing routines that decide the various parsing workflows are defined.

The current parsing rules implement the functionality in the following manner:

1. The inbound message text is passed as an argument to the parse_message() method in the s3msg.py module.
2. The text is matched against a predefined list of primary and contact keywords after splitting on whitespace.
3. A database query is generated against the concerned database according to the matched keywords.
4. The query retrieves the relevant field values and generates a reply to the inbound message.
5. So far these parsing rules have been implemented only for the 'Person', 'Hospital' and 'Organisation' modules.

Extending these rules to other modules can be in the scope of the project.
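To make the flow concrete, the five steps above can be sketched in plain Python. This is a simplified standalone sketch: the keyword lists and the lookup() stub are illustrative placeholders, not the actual code in modules/s3/s3msg.py.

```python
# Simplified sketch of the keyword-matching flow described above.
# The keyword lists and the lookup() stub are illustrative placeholders,
# not the actual code in modules/s3/s3msg.py.

PRIMARY_KEYWORDS = ("person", "hospital", "organisation")
CONTACT_KEYWORDS = ("phone", "email", "address")

def lookup(module, fields, name):
    # Placeholder for the database query that parse_message()
    # would issue against the matched module's table.
    return "%s for %s: <result of DB query on %s>" % (
        ", ".join(fields), name, module)

def parse_message(text):
    # 1. Split the inbound text on whitespace
    words = text.lower().split()
    # 2. Match against the primary and contact keyword lists
    primary = [w for w in words if w in PRIMARY_KEYWORDS]
    contact = [w for w in words if w in CONTACT_KEYWORDS]
    if not primary:
        return None  # no recognised module keyword
    # Remaining words are treated as the search term
    name = " ".join(w for w in words
                    if w not in PRIMARY_KEYWORDS and w not in CONTACT_KEYWORDS)
    # 3./4. Query the matched module and build the reply
    return lookup(primary[0], contact or ["details"], name)

reply = parse_message("phone person John Smith")
```

The hard-coding visible here (fixed keyword lists, one routine for all sources) is exactly what moving the rules into private/templates is meant to remove.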

* One of the main issues will be identifying the messages that belong to a particular source, so that each source can have its own processing. This is handled by the data model, which defines a 'msg_workflow' table in the database linking the Source to the Workflow with any required args. The essential features of this approach are listed below:

1. The Parser workflow table links 'SMS Source X' to 'Workflow Y'.
2. Designing the details of Workflow Y would be a developer task.
3. Linking 'SMS Source X' to 'Workflow Y' will be a configurable option.
4. So essentially, the Parser table links Source to Workflow with any other required args, and this acts as a template for the scheduler_task table.
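As a rough illustration of the link table described above, the sketch below uses sqlite3; in Eden it would be defined with the web2py DAL inside S3ParsingModel, and the field names beyond the source/workflow ids are assumptions, not the final schema.

```python
# Sketch of the msg_workflow link table described above, using sqlite3
# purely for illustration. In Eden this would be a web2py DAL
# define_table() inside S3ParsingModel; field names other than the
# source/workflow task ids are assumptions, not the final schema.
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("""
    CREATE TABLE msg_workflow (
        id INTEGER PRIMARY KEY,
        source_task_id TEXT NOT NULL,    -- e.g. 'SMS Source X'
        workflow_task_id TEXT NOT NULL,  -- e.g. 'Workflow Y'
        args TEXT                        -- extra args passed to the parser task
    )
""")
# Link 'SMS Source X' to 'Workflow Y' (a configurable option)
db.execute("INSERT INTO msg_workflow (source_task_id, workflow_task_id, args) "
           "VALUES (?, ?, ?)", ("SMS Source X", "parse_workflow_y", "{}"))

# Given an inbound message's source, retrieve the workflow to schedule
row = db.execute("SELECT workflow_task_id FROM msg_workflow "
                 "WHERE source_task_id = ?", ("SMS Source X",)).fetchone()
```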

* A task process_log() is defined in tasks.py. The objective of process_log() is to scan through all the messages in msg_log and process for parsing those which are flagged as unparsed (is_parsed=False). The task is scheduled in zzz_1st_run.py, where it is chained to the concerned parsing task (this is achieved via the msg_workflow table: the 'source_task_id' field in msg_log is used to retrieve the respective parsing workflow_task_id from msg_workflow).
* This also allows for chaining of workflows, where the source for a workflow could be another workflow instead of an incoming source. We can have 2nd-pass parser workflows which don't start from the source directly but can be plugged in as the output of a 1st-pass one:
{{{
Source -> process_log() -> 1st-pass parser -> detailed Parser -> Module
}}}
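A minimal sketch of that process_log() pass, with dicts standing in for database rows (the real task would live in tasks.py and be driven by the web2py scheduler; the sample data and workflow name are hypothetical):

```python
# Sketch of the process_log() pass described above: scan msg_log,
# pick up unparsed messages, and hand each to the workflow linked to
# its source via msg_workflow. Dicts stand in for database rows;
# the sample data and the workflow name are hypothetical.

msg_log = [
    {"id": 1, "body": "phone person John", "source_task_id": "sms_modem",
     "is_parsed": False},
    {"id": 2, "body": "already handled", "source_task_id": "sms_modem",
     "is_parsed": True},
]
# msg_workflow: Source -> Workflow (with any required args)
msg_workflow = {"sms_modem": "parse_cert_response"}

def parse_cert_response(body):
    # Placeholder for a per-deployment parsing workflow
    return "parsed: " + body

workflows = {"parse_cert_response": parse_cert_response}

def process_log():
    results = []
    for msg in msg_log:
        if msg["is_parsed"]:
            continue                    # skip already-parsed messages
        task = msg_workflow[msg["source_task_id"]]
        results.append(workflows[task](msg["body"]))
        msg["is_parsed"] = True         # flag as parsed
    return results

out = process_log()
```

A second invocation finds nothing left to do, which is what makes the task safe to run repeatedly from the scheduler.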
* Here, the 1st-pass parser is customised per-deployment; it decides which email source goes to a particular workflow (a simple msg_workflow link), or decides, based on other factors such as keywords, to which main workflow the messages should be passed.
* The data model is integrated with the templates folders (or a sub-folder, say private/templates/parsing), which serves as the initial UI. The post-install UI will consist of a CRUD interface admin panel, a simple s3_rest_controller(). Eventually, however, this is planned to be part of the WebSetup.
* We want to be able to direct the message to the appropriate module to handle the data. This could be done either by launching a real REST request or by simulating one via the API:
{{{
resource = s3mgr.define_resource("module", "resourcename")
}}}
* Messages which are routed to a specific resource can be subscribed to by the user. For this purpose we can use the existing Save Search and Subscription functionality, where the user can subscribe to new messages for a specific resource using a resource filter. The msg_log can be made a component of the resources; if it is a component, then when someone opens the resource the messages will appear in a tab. If a message has to be tied to multiple resources, we can use a relationship (link) table.
* Implementing/extending the utility for other modules, especially the IRS module, will be of real use, where enabling reports to be logged through SMS will be vital. This can also use the OpenGeoSMS encoding standard (LatLon generates a google-maps URL) for integration with our Android client. A dedicated routine to generate OpenGeoSMS URLs already exists, prepare_opengeosms() in s3msg.py itself, so integration with the parsing routine won't be difficult. Other modules for which this can be implemented are 'Request' and 'Inventory'.
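On the inbound side, pulling the LatLon back out of an OpenGeoSMS-style body could look roughly like this; the exact URL layout is an assumption for illustration (OpenGeoSMS embeds the location as a maps query URL), so treat the regex as a sketch rather than the real decoder.

```python
import re

# Illustrative sketch: extract a lat/lon pair from an OpenGeoSMS-style
# message body, which embeds the location as a google-maps URL (the kind
# of URL prepare_opengeosms() in s3msg.py generates). The exact URL
# layout assumed by this regex is an illustration, not the spec.
LATLON = re.compile(r"maps\?q=(-?\d+\.?\d*),(-?\d+\.?\d*)")

def extract_latlon(body):
    m = LATLON.search(body)
    if not m:
        return None  # not an OpenGeoSMS-style message
    return float(m.group(1)), float(m.group(2))

loc = extract_latlon("http://maps.google.com/maps?q=28.6,77.2 Report: flooding")
```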
* Finally, the code will be tested on the system and the bugs (if any ;-) ) will be fixed.

Future Options:

* Though the parsing rules will be generic, a few minor tweaks for other processes such as Email and Twitter will have to be performed to maintain their generic nature.
* One of the most valuable pieces of functionality that can be added here is to make the SMS communication more interactive: e.g. if the text body received does not match any of the expected keywords, the API dispatches a reply stating the expected format of the message.
* Adapting the parsing rules to cover as wide a base of inbound messages as possible. This will involve building a wider collection of keywords to search for in every concerned module. Linking different labels across the DB to module-specific keywords will be really helpful. The list of primary keywords to be matched can also be made a deployment-specific option.
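The interactive fallback described above could be as simple as the following sketch; the keyword list and the reply text are placeholders, not a proposed final wording.

```python
# Hypothetical sketch of the interactive fallback: if no expected
# keyword is found in the inbound text, reply with the expected format
# instead of staying silent. Keywords and reply text are placeholders.
EXPECTED = ("person", "hospital", "organisation")
USAGE = ("Expected format: <keyword> <name>, where keyword is one of: "
         + ", ".join(EXPECTED))

def reply_for(text):
    words = text.lower().split()
    if not any(w in EXPECTED for w in words):
        return USAGE   # dispatch a help reply to the sender
    return None        # message is handled by the normal parsing path

help_msg = reply_for("hwere is jon?")  # typo-ridden text: no keyword matched
```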

Relevant Experience:

* I have developed a thorough understanding of the existing parsing routine in the application. I am also comfortable using various parser generators. I have discussed many of the ideas in this proposal with the mentors and the rest of the community.
* My experience with the Sahana community has been very enjoyable so far, Sahanathon being one of the highlights, where I got the opportunity to demonstrate my ability to work with the code and contribute to Sahana Eden. My notable contributions so far are listed below:
* I solved bug #1132 in Trac (http://eden.sahanafoundation.org/ticket/1132), which was merged during the Sahanathon itself. Pull request: https://github.com/flavour/eden/pull/31
* Reported and fixed a defect with the update ("Open") button in the saved searches table.
* Made milestones in the project task workflow a deployment-specific option. (See https://github.com/flavour/eden/pull/35 )
* Fixed email_settings() in the msg controller and made the required changes in the menu. (See https://github.com/flavour/eden/pull/42 )

=== 5. Project Goals and Timeline ===

Work Already Undertaken:

* Currently parsing is implemented by the parse_message() method in the s3msg.py module, though its usage is limited, or rather unimplemented, as of now.
* The current method is hard-coded and inefficient at handling different processes.
* A dedicated data model has been developed with the consent of the mentors. The msg_workflow table has been defined exhaustively in the implementation details and in the linked gdoc.
* A mechanism to route messages to resources has also been designed and discussed.

First trimester:

Due Date - SMART Goal - Measure

- Code committed locally.

Second Trimester:

15th July -
Integration with templates folders.
Development of post-install UI (CRUD interface admin panel).