
Retrospecting on 2019

Oh 2019, What a Year!

As 2019 comes to a close, I thought it would be helpful to review, reflect on, and celebrate professional, personal, and educational contributions and accomplishments before leaping into the new year and its adventures.

Educating Alicia

Great Books about Data!

What an epic year to read about data, and specifically from the viewpoint of women! Early in the year a tweet led me to a data viz website, which led me to the preview edition of Data Feminism by Catherine D’Ignazio and Lauren Klein. Chapter Four: Unicorns, Janitors, Ninjas, Wizards, and Rock Stars struck a chord about the danger of solo, non-collaborative data gathering, munging, and analysis.

Via the great 99% Invisible podcast, I learned about the book Invisible Women: Data Bias in a World Designed for Men by Caroline Criado Perez, which documents the shocking lack of data specific to women and the alarming consequences for women's medical treatment and safety, from products and cars to transportation routes, to name a few.

Conferences, Courses and the Cloud

As the number 1 fan of Coursera, I discovered an awesome course, Design Thinking for Innovation, that resonated with my agile development roots and provided more tools for developing programs iteratively and with a learning mindset. Oftentimes I will just audit courses and watch and listen to lectures on the #37 bus into work unless a fun project is available, which in this case led to enrollment and immense enjoyment.

I also had the great pleasure of attending my first RStudio Conference (2019) in lovely Austin, where I had the opportunity to learn about all kinds of data communication and analysis possibilities along with my colleague, Ryan, chronicled here. In addition to the data fun, my hotel was conveniently located near the amazing Cheer Up Charlie’s, featuring delicious plant-based fast food from Arlo’s.

Late in the year I took advantage of the AWS Partner Network online training resources and successfully passed my first exam, the AWS Cloud Practitioner certification.

Professional and Personal Contributions in 4 Quarters

One of my favorite books, Illusions: The Adventures of a Reluctant Messiah by Richard Bach, includes the quote, "You teach best what you most need to learn," a mantra that I hold dear to my heart. It leads me to keep an eye out for educational opportunities with customers and colleagues, to immediately apply newly learned things to see how they serve both professional and personal projects, and to later share lessons learned through blog posts and brown-bag lunch-and-learns.

Q1 - Connecting with Customers

In addition to regular solutions architecture work, I partnered with some of my favorite clients and colleagues to develop content and labs for Tyler Connect 2019. Beyond the bonus learning time, it’s great to promote the amazing work being done in government and share it with peers tackling similar problems and challenges in their own communities and organizations.

Meta Analysis and Dashboarding

Paul Alley and I developed dynamic dashboards using Power BI and Shiny that consume multiple APIs to provide insights into the overall health of the City of Seattle Open Data catalog. More on our session, Visualizing Your Data: The Power of BI, in this post.

Flexing R Dashboards

Justin Baker, my colleague Ryan, and I teamed up to teach the basics of R and show how easy it is to create compelling economic reports in our session, R You Ready to Learn R Socrata.

Publishing all things with Python

Roger Sliva, my colleagues Ryan and Chris, and I shared examples and data engineering use cases from Mesa, AZ in our session, Create and Update Data with Python & APIs.

Q2 - Communicating Well with Data

As a data analyst I have spent a lot of time focusing on the cleaning, reshaping, enrichment, and visualization of data on dashboards. However, over the last year, my colleagues and I have been consumed with performing audits of customer domains to understand their current state, and we needed a way to communicate our findings to many audiences. Until proven otherwise, I am all in on using R Markdown to develop the most delightful documents with code.

Word, Reports with Data and Friendly Language for All

While I have not yet implemented parameters in my reports, Mike K Smith’s presentation, The lazy and easily distracted report writer: Using rmarkdown and parameterised reports, inspired my first R Markdown generated, client-facing Word data analysis reports.
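For the curious, here is a minimal sketch of what that kind of parameterised, Word-rendering workflow can look like; the report file, domain names, and parameter names are placeholders rather than anything from my actual client reports.

```r
# report.Rmd (sketch) would declare parameters in its YAML header:
#
# ---
# title: "Domain Audit"
# output: word_document
# params:
#   domain: "data.example.gov"
#   year: 2019
# ---
#
# ...and reference them inside chunks as params$domain and params$year.

# Rendering one Word document per client from a plain R script:
library(rmarkdown)

clients <- c("data.example.gov", "data.othercity.gov")  # hypothetical domains

for (domain in clients) {
  render(
    input       = "report.Rmd",
    output_file = paste0("audit-", domain, ".docx"),
    params      = list(domain = domain, year = 2019)
  )
}
```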

Thematic stories with 3rd Party Data

With the help of my colleague and fellow architect, Alex, I crafted recipes around common topics that governments care about, like Economic Development and Education. The resulting Socrata stories included data for Median Income, Employment Status, Median Home Value, and School Enrollment from the Census, and GDP data from the BEA.

Q3 - Building Useful Things

As a Solution Architect, I often get the chance to bridge the gap between existing product features and temporary ones that solve a specific problem for a customer as soon as possible, working on behalf of the Customer Success team and in collaboration with the Product Development team to ensure roadmap alignment.

Powering NYC’s annual report with all the Analytics Data

New York City is all in on Open Data and produces an annual report on the state of its program. This summer they requested many metrics about their catalog, from the size of the data to utilization by asset type and method of consumption. Due to the urgency of the data need, I created scripts to gather much of this data in support of the 2019 Open Data for All Report.
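The report scripts themselves live elsewhere, but as a rough illustration, catalog counts like these can be pulled from Socrata's public Discovery API; the asset type names and response field below follow the public docs, and this is a sketch rather than the exact scripts used for the report.

```r
# Sketch: count catalog assets by type for a domain via the Discovery API.
library(httr)
library(jsonlite)

count_assets <- function(asset_type, domain) {
  resp <- GET(
    "https://api.us.socrata.com/api/catalog/v1",
    query = list(domains = domain, only = asset_type, limit = 1)  # limit keeps the payload small
  )
  stop_for_status(resp)
  parsed <- fromJSON(content(resp, as = "text", encoding = "UTF-8"))
  parsed$resultSetSize  # total number of matching assets
}

# Asset type values per the public Discovery API documentation
sapply(c("dataset", "chart", "map"), count_assets, domain = "data.cityofnewyork.us")
```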

And again for Federal Insights

Once one does one thing for one customer, inevitably word gets out, and I reused the scripts to gather similar data for many of our Federal customers before year end (and before the product analytics release).

Queuing up (and scoring) my next Reads

As an avid reader, and a very loyal one to my favorite authors, I often worry about not knowing when the latest installment in a series will be published and fear droughts of new novels. With the demise of my local Seattle Mystery Bookstore a few years ago and my inability to keep up with their zine, I have dabbled first in Ruby and now Shiny to keep myself informed of new books.

For both fun and out of desperate need, I created a backlog on GitHub to help me develop a Shiny app that queries the free Google Books API for my favorite authors and uses the tidytext package to score my likelihood of enjoying their latest books based on coziness and continuity of characters. More on mybookq.com in this post.
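As a rough sketch of the core idea (the author and the "cozy" word list below are placeholders, and the app's real scoring is more involved):

```r
# Sketch: fetch an author's newest titles from the Google Books API and
# score their descriptions against a made-up "cozy" word list with tidytext.
library(httr)
library(jsonlite)
library(dplyr)
library(tidytext)

author <- "Louise Penny"  # placeholder favorite author

resp <- GET(
  "https://www.googleapis.com/books/v1/volumes",
  query = list(q = paste0('inauthor:"', author, '"'),
               orderBy = "newest", maxResults = 10)
)
stop_for_status(resp)
books <- fromJSON(content(resp, as = "text", encoding = "UTF-8"), flatten = TRUE)$items

cozy_words <- c("village", "inspector", "garden", "tea", "bookshop")  # hypothetical lexicon

books %>%
  transmute(title = volumeInfo.title, text = volumeInfo.description) %>%
  filter(!is.na(text)) %>%
  unnest_tokens(word, text) %>%
  mutate(cozy = word %in% cozy_words) %>%
  group_by(title) %>%
  summarise(cozy_score = sum(cozy)) %>%
  arrange(desc(cozy_score))
```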

Q4 - Unleashing even more Data

One of my primary initiatives has been to help craft reusable scripts that gather valuable 3rd party data for our clients to support their performance programs, breaking the cycle of reinventing the wheel that my team has repeated year after year, client after client, technology after technology. Luckily, my impatient and very talented colleague Chris packaged and heavily refactored this code into the remarkable Python library autocensus.

However, Census data is just the beginning, and at most it provides annual estimates for populations big enough to estimate safely. Even more data is available programmatically via APIs. In the coming year I intend to dive deeper into the following sources, their points of greatness and pain, and the many lessons learned.

BEAutiful Economic Analysis API

The economic contributions of a geography can be gleaned from measures like Gross Domestic Product, and through engagements with San Bernardino County (the largest county in the US), I had the opportunity to interact with the BEA API. I must say that it has been a complete pleasure. Once a free key is obtained, there are no daily request limits. Great documentation exists, and when errors are returned, they include helpful messages about the expected parameters for a particular call. UNPRECEDENTED!
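To illustrate just how friendly it is, here is roughly what a county GDP request looks like in R; the table name, line code, and response path are assumptions to double-check against the BEA documentation, and BEA_KEY stands in for your own free key.

```r
# Sketch: request county-level GDP from the BEA Regional dataset.
library(httr)
library(jsonlite)

resp <- GET(
  "https://apps.bea.gov/api/data/",
  query = list(
    UserID       = Sys.getenv("BEA_KEY"),  # free key from bea.gov
    method       = "GetData",
    datasetname  = "Regional",
    TableName    = "CAGDP2",   # assumed: county GDP in current dollars
    LineCode     = 1,          # assumed: all-industry total
    GeoFips      = "06071",    # San Bernardino County, CA
    Year         = "2018",
    ResultFormat = "json"
  )
)
stop_for_status(resp)
fromJSON(content(resp, as = "text", encoding = "UTF-8"))$BEAAPI$Results$Data
```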

Untangling a BLS Mess

The BLS API provides all kinds of great data, at cadences as frequent as monthly, for many topics, and I am thankful for its existence. However, it throttles requests at 500 per day, which can be hit quickly when gathering lots of historical data for multiple geographies and surveys. I do very much appreciate the latest=true parameter.
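For reference, a latest=true call against the public timeseries endpoint looks roughly like this; the series ID (the national unemployment rate) and the environment variable holding the registration key are examples to adapt.

```r
# Sketch: grab only the most recent observation for one BLS series.
# LNS14000000 is the seasonally adjusted national unemployment rate.
library(httr)
library(jsonlite)

series_id <- "LNS14000000"
resp <- GET(
  paste0("https://api.bls.gov/publicAPI/v2/timeseries/data/", series_id),
  query = list(latest = "true", registrationkey = Sys.getenv("BLS_KEY"))
)
stop_for_status(resp)
parsed <- fromJSON(content(resp, as = "text", encoding = "UTF-8"))
parsed$Results$series$data[[1]]  # latest year, period, and value
```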

Making Sense of the Census

I love the Census and admire the work they have done to revamp all the things. As an optimistic coder, though, I have run into peculiarities downstream while building visualizations: the lack of an estimate for a given entity, variable, and year comes back as the value -666,666,666, and occasionally a variable code changes in some manner from year to year. More to come on this subject!
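As a concrete example of the sentinel issue, here is one way I could swap those values for NA before plotting; the data frame and column names are hypothetical.

```r
# Sketch: treat the Census API's -666,666,666 sentinel (estimate not
# available) as NA so it doesn't wreck downstream charts.
library(dplyr)

clean_estimates <- function(df, value_col = "estimate") {
  df %>%
    mutate(across(all_of(value_col),
                  ~ ifelse(.x == -666666666, NA_real_, .x)))
}

# Hypothetical usage with an ACS median household income variable:
# acs <- clean_estimates(acs, value_col = "B19013_001E")
```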

Merry New Year!

To 2020, cheers all!