Hey y’all, I know it’s been a while since I last made a blog post. Things got kinda crazy over winter break and over the course of this semester; life’s unexpected in that way, but I’m back and have a lot to share with y’all!
So this past weekend I attended my second hackathon: HackDFW, up in Dallas! For those who are unfamiliar with what a hackathon is, I wrote another post about the previous one I attended that briefly discussed the purpose of these events. Now, this hackathon had a way different vibe than HackTX did. When my hacking partner, Devin, and I got there, they were blasting EDM, and that pretty much remained the case throughout the entire event (yes, even at 8am on Sunday morning). While at this hackathon, we had a lot of project ups and downs: hardware issues, sleep deprivation, inaccessible libraries… just a lot, really. We ended up not being able to submit a hack due to several of these issues (which I’ll talk about in more depth later on in this post).
While in attendance, Devin and I, along with many other attendees, had the unfortunate experience of dealing with a spotty internet connection; there were small wifi hubs set up around various sections of the venue, and some were stronger than others. For an event where participants rely heavily on things like web-based APIs to gather data and build apps/hacks, you can see how this would be frustrating.
It was through this collective frustration that Devin and I formed our idea: let’s gather information about these various wireless networks scattered about the venue and create a bunch of data sets to look at how the spotty wifi hubs vary in signal strength over time. Then, let’s model these data sets using some stochastic geometry techniques. Specifically, we wanted to use a Boolean model (which I’ll be discussing in more depth in my next post, because I’m giving a talk about it later this week!) to look at the range of coverage of each individual network.
We didn’t have a clear idea of how this would be beneficial, but we figured that, at the very least, the organizers might get some ideas about how to fix this issue for next year. If nothing else, it would be a nice visualization tool. This project was going to be a divide-and-conquer effort: Devin would work on the hardware and acquiring data, and I would write the simulations in Wolfram Language, testing the code with randomly generated data.
Anyway, to start this project we first approached one of the sponsors, who was lending out chipKITs: Arduino-like devices capable of being programmed to do a variety of different tasks. The sponsor noted that the device itself was built for the Arduino community, so it could use the same sorts of libraries that Arduinos use. On top of the chipKIT uC32, we also acquired a wifi shield to gather information about the wireless networks and have the device output a stream of data about them (stuff like network info and RSSI levels). Here’s where we ran into some issues: the device required a mini-USB cable… a connector that isn’t really in use anymore, as most people have switched to micro-USB. So, taking this issue to the sponsor, we were told that they would run to Fry’s Electronics to find some; we were playing a waiting game at this point.
The sponsor returned and informed us that they had found three cables there, but they were $11 each, so they didn’t buy any. Both Devin and I felt that this was not appropriate behavior for a hackathon sponsor, and we were pretty frustrated. Eventually, after a lot of coffee, consideration, and beating our heads against the wall, we decided to stick with this project.
Devin made a run to Best Buy to purchase the cable, and I continued to work on the simulation. He came back, and finally! We were getting somewhere! But not so fast: we then ran into the issue of incompatible libraries. What gives? We had been told that Arduino libraries were fully supported on these chipKIT devices.
Because of this issue, we were not able to get the data into a readable, transferable format that I could read into the simulation and model. I also ran into a couple of issues myself while writing the code for the Boolean model (I got the underlying Poisson point process down, but generating the second distribution of independent marks proved quite a challenge for me at the time; I’m going to tackle it again before my next blog post).
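For anyone curious about what that simulation involves: a Boolean model is just a Poisson point process of “germ” points with an independent mark (here, a disk radius) attached to each one, and the coverage region is the union of the resulting disks. My actual code was in Wolfram Language, but here’s a minimal Python sketch of the same idea; the exponential radius distribution and all the function names are my own illustrative choices, not what we used at the event:

```python
import math
import random

def sample_poisson(lam, rng):
    # Knuth's method (assumes lam > 0): multiply uniforms
    # until the running product falls below e^(-lam)
    threshold = math.exp(-lam)
    count, product = 0, 1.0
    while product > threshold:
        product *= rng.random()
        count += 1
    return count - 1

def boolean_model(intensity, side, mean_radius, rng):
    """Germs: homogeneous Poisson process on a side x side square.
    Grains: disks with i.i.d. exponential radii (the 'independent marks')."""
    n = sample_poisson(intensity * side * side, rng)
    germs = [(rng.uniform(0, side), rng.uniform(0, side)) for _ in range(n)]
    radii = [rng.expovariate(1.0 / mean_radius) for _ in range(n)]
    return germs, radii

def coverage_fraction(germs, radii, side, rng, samples=2000):
    # Monte Carlo estimate of the area fraction covered by the union of disks
    covered = 0
    for _ in range(samples):
        x, y = rng.uniform(0, side), rng.uniform(0, side)
        if any(math.hypot(x - gx, y - gy) <= r
               for (gx, gy), r in zip(germs, radii)):
            covered += 1
    return covered / samples

rng = random.Random(42)
germs, radii = boolean_model(intensity=0.5, side=10.0, mean_radius=1.0, rng=rng)
frac = coverage_fraction(germs, radii, 10.0, rng)
```

With these parameters, the estimate should land in the neighborhood of the theoretical coverage probability 1 − exp(−λπE[R²]) ≈ 0.96, a bit lower in practice because the sketch only simulates germs inside the window (edge effects).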
It’s quite unfortunate we weren’t able to submit, but it happens! Hackathons are unpredictable, and it’s this unpredictability that makes them a worthwhile learning experience.