This walkthrough will take you through all the steps necessary to send a new type of data stream to GATD:
Make sure you have checked out the two Git repos, lab11/gatd and lab11/gatd-lab11, as well as our SVN repo, shed, which contains private configuration information specific to our instance of GATD in shed/projects/gatd/gatd.config.
You will have to copy our specific gatd.config from shed into gatd-lab11/config so that all of the scripts in the gatd-lab11 repo that communicate with our instance of GATD magically work.
In the (Github) gatd repo, go to the scripts directory and run
python profile.py --parser PARSER
where PARSER is the class name of the formatter you will write in a bit (sorry, there is no way to order these instructions without forward references). You can find examples of formatter class names in lab11/gatd-lab11/formatters. They follow the general format deviceNameParser: written in camelCase and ending with Parser.
Since this is a new parser, the script will ask you if you want to generate a new profile ID for that parser. You sure do.
The script will then generate a profile ID for your data stream. Hang on to that! You'll use that profile ID to identify your data stream when sending packets to GATD.
aubade:scripts meghan$ python profile.py --parser tesselClimateParser
No Profile ID found for that parser.
Create new Profile ID for tesselClimateParser? [n]: y
Added Profile ID oBNeydOsio for tesselClimateParser
The ways in which a client can send data to GATD are determined entirely by the available receivers. The README in the lab11/gatd/receiver subdirectory is super useful; you should read it.
At this point in time, there are three ways for a client to send data to GATD: UDP, TCP, and HTTP POST. The different ports on which the corresponding GATD receivers listen are determined by gatd.config.
The HTTP POST receiver is the easiest to use. It will take the parameters in your URL and convert them into a JSON blob called data, which you will then need to unpack in the formatter; alternatively, you can send JSON in the body, which the receiver will pass along as-is, so your formatter won't have to do anything. If you are using UDP or TCP, you can basically send binary blobs to GATD that you will later unpack and convert into JSON in a formatter. Make sure one of the fields you send is your profile ID. All of the receivers add a timestamp, source address, and source port to your packet before they pass it on to the formatter associated with your packet's profile ID.
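As a sketch of a UDP client, the Python below packs readings plus the mandatory profile ID into a JSON blob and fires it at a receiver. The host, port, and field names other than the profile ID are made-up placeholders; the real host and port come from gatd.config.

```python
import json
import socket

# Placeholder values -- the real GATD host and UDP receiver port
# come from gatd.config, not from here.
GATD_HOST = "127.0.0.1"
GATD_PORT = 4001

def make_packet(profile_id, **fields):
    """Bundle readings plus the mandatory profile ID into a JSON blob."""
    fields["profile_id"] = profile_id
    return json.dumps(fields).encode("utf-8")

def send_reading(profile_id, **fields):
    """Send a single UDP datagram to the GATD UDP receiver."""
    packet = make_packet(profile_id, **fields)
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.sendto(packet, (GATD_HOST, GATD_PORT))
    sock.close()

if __name__ == "__main__":
    # e.g. a climate reading tagged with the profile ID from profile.py
    send_reading("oBNeydOsio", temperature_c=21.4, humidity_pct=40.2)
```

A real binary UDP/TCP client would pack its own byte format instead of JSON and rely on the formatter to decode it; the only hard requirement from the receiver's point of view is that the profile ID is in there somewhere.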
Once you have written the client, don't start sending data yet! All the packets you send to GATD now will just queue up until you have a formatter in place to process them.
Write the formatter and add it to gatd-lab11/formatters. The easiest way to figure out what it should look like is to look at the many formatters already there. If you use HTTP POST with a JSON body, it's all translated into JSON for you by the receiver. Convenient!
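Conceptually, a formatter is a class that takes the dict the receiver built (your fields plus the timestamp/address metadata) and returns the JSON-ready dict GATD should store. The sketch below is NOT the actual base-class interface (check the existing formatters in gatd-lab11/formatters for that); the method name and fields here are hypothetical.

```python
class tesselClimateParser(object):
    """Hypothetical formatter sketch. The real method names and base
    class are whatever the existing gatd-lab11/formatters use."""

    # Profile ID generated by profile.py for this data stream
    PROFILE_ID = "oBNeydOsio"

    def parse(self, data):
        # The receiver has already added timestamp/address metadata.
        # This hypothetical stream sent JSON, so there is no binary
        # blob to unpack; a UDP/TCP stream would decode its bytes
        # into named fields here instead.
        out = dict(data)
        out["device_type"] = "tessel_climate"  # example derived field
        return out
```

The parse step is where a binary stream earns its keep: whatever byte layout your client sends, this is the one place that has to know how to turn it back into named JSON fields.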
Once you write a formatter and check it in to the repo, it should be automatically incorporated into the GATD instance running on inductor. However, this can be a bit tricky. If the automatic integration doesn't work, let Brad know so that he can figure out why and make sure your formatter gets deployed.
Once the formatter is live, you should be able to start streaming. This is a good time to check if it worked.
In (Github) gatd/scripts there's a script called formatter_test.py. You can use this to test if your setup is working end-to-end.
If you have access to shed, another way to test is to go into shed/projects/gatd, add another list_xxx.py script specific to your profile ID, and run that.
Something like (Github) gatd/scripts/explorer_edit.py.
Todo, but easy