About Data at Workiva
We are looking for a Senior Data Engineer on the Data Management & Analytics team as we build the next generation of analytics and machine learning for use in the Workiva Platform and by internal teams across the company. Data Engineers at Workiva work with data from inside and outside the company to provide valuable insights to our customers. Our team partners with data science and machine learning teams to support the new technologies we are implementing. We use Amazon Web Services to load batch and real-time data and deliver it when and where it is needed. The Data Engineering team empowers Workiva’s platform and elevates its value to our customers.
The Senior Data Engineer is a critical role that leads technical projects and mentors team members to accomplish complex tasks. You will build libraries and distributed services that support multiple data analytics teams and business intelligence engineers reliably and at scale in AWS cloud environments. You will provide cutting-edge, reliable, and easy-to-use systems for ingesting and processing data, and help the teams that build data-intensive applications succeed. This role leads and collaborates with cross-functional teams across the entire organization on the planning, execution, and successful completion of technical projects, with the ultimate purpose of improving the customer experience. The Senior Data Engineer will build and maintain batch and real-time data flows used for business intelligence, analytics, and machine learning across all organizations within Workiva, including storing and exposing data via databases, a data lake, and other APIs. This role works primarily with other Data Engineers, but also with Data Scientists, ML Engineers, and business partners to ensure the highest levels of quality, reliability, and performance.
What You’ll Do
- Design and develop data extraction and integration code modules for batch and incremental data flow from various data sources using new and existing patterns
- Use new and existing tools and processes to deploy to integration and production environments. Lead the development and maintenance of deployment processes.
- Improve the health of the data ecosystem by ensuring complete monitoring of the system, defining alerts on critical failure points, and improving data quality in partnership with data owners and business partners.
- Test software, validate data and write automated tests (unit, integration, functional, etc.).
- Review peers’ code and provide thorough, constructive feedback based on team standards for correctness.
- Triage and resolve production issues. Communicate status to business partners and organizations at all levels within Workiva; escalate as needed.
- Design data lake storage and access patterns that match customer requirements, conform to naming standards, and implement industry best practices.
- Understand the data at a deep level and apply security appropriately. Inspect emerging use cases, non-standard access patterns, and encryption requirements to apply security that may not be covered by current security guidelines.
- Tune processes and SQL to reduce cost and wait time. Design systems that balance data volume, latency, and customer requirements. Identify which areas of the system need attention.
- Work with business partners to write requirements, surface unwritten requirements, and test deployed code.
- Join an on-call rotation to support production workflows during off hours.
- Assist the Product Owner in defining the direction of the data product in Workiva.
- Keep up with industry standards and new technology and how they apply to Workiva.
What You’ll Need
- Bachelor’s degree in Computer Science, Engineering, Math, Finance, Statistics, or a related discipline, or an equivalent combination of education and experience
- 4+ years of relevant experience in a data engineering role, including data warehousing and business intelligence tools, techniques, and technologies, or experience in analytics, business analysis, or comparable consumer analytics solutions
- 2–3 years of statistics experience preferred
- Experience with big data processing and using databases in a business environment with large-scale, complex datasets (e.g., SQL, Hadoop, Spark, Flink, Beam)
- Extensive experience using business intelligence reporting tools (e.g., QuickSight, Tableau, Splunk)
- Extensive knowledge of SQL query design and tuning for performance and accuracy
- Experience with Python, R, or other scripting languages relevant to data work preferred
- Experience working with data lakes and exposing data to serve multiple different access needs
- Experience in an Agile/Sprint working environment preferred
- Proficient research skills to locate market information using numerous internal and external data sources
- Excellent communication (verbal and written) and interpersonal skills, with the ability to communicate effectively with both business and technical teams
- Excellent planning and organizational skills to prioritize numerous projects and ensure data is delivered to the end user accurately and understandably
- Travel: Less than 10%