**Experienced Full Stack Software Engineer – Big Data Applications Development for arenaflex**
Posted 2026-05-05

*Join arenaflex, a leading entertainment and media company, in shaping the future of Disney's media business. As a Senior Software Engineer on our Big Data Applications team, you will play a critical role in designing and building systems to process data at scale, solving challenging problems in both batch and real-time data processing, and working across software and data disciplines to engineer solutions.*
**About arenaflex**
arenaflex is a global leader in entertainment, media, and technology, with a rich legacy of creating world-class stories and experiences for every member of the family. From humble beginnings as a cartoon studio in the 1920s to its preeminent name in the entertainment industry today, arenaflex proudly continues its legacy of innovation and creativity. With operations in more than 40 countries, our employees and cast members work together to create entertainment experiences that are both universally and locally cherished.
**The Big Data Applications Team**
The Big Data Applications team is a segment under the arenaflex Entertainment & ESPN Technology (DEET) organization, responsible for end-to-end development of Disney's world-class consumer-facing products, including streaming platforms Disney+, Hulu, and ESPN+, and digital products & experiences across ESPN, Marvel, Disney Studios, NatGeo, and ABC News. Our team drives innovation at scale for millions of consumers around the world across Apple, Android, Smart TVs, game consoles, and the web, with our platforms powering core experiences like personalization, search, messaging, and data.
**Job Summary**
We are seeking a highly motivated Senior Software Engineer with a strong technical background to join our Big Data Applications team. As a key member of our team, you will contribute to maintaining, updating, and expanding the existing Data Capture platform, including the Spark data pipelines, while maintaining strict uptime SLAs. You will also extend the functionality of current Data Capture platform offerings, implement the Lakehouse architecture, and architect, design, and code shared libraries in Scala and Python that abstract complex business logic to allow consistent functionality across the Data organization.
**Responsibilities**
- Contribute to maintaining, updating, and expanding the existing Data Capture platform, including the Spark data pipelines, while maintaining strict uptime SLAs
- Extend the functionality of current Data Capture platform offerings, including metadata parsing, extending the metastore API, and building new integrations with APIs both internal and external to the Data organization
- Implement the Lakehouse architecture, working with customers, partners, and stakeholders to shift towards a Lakehouse centric data platform
- Architect, design, and code shared libraries in Scala and Python that abstract complex business logic to allow consistent functionality across the Data organization
- Collaborate with product managers, architects, and other engineers to drive the success of the Data Capture platform
- Lead the development and documentation of both internal and external standards and best practices for pipeline configurations, naming conventions, partitioning strategies, and more
- Ensure high operational efficiency and quality of the Data Capture platform datasets so that our solutions meet SLAs and project reliability and accuracy to all our stakeholders (Engineering, Data Science, Operations, and Analytics teams)
- Be an active participant and advocate of agile/scrum ceremonies to collaborate and improve processes for our team
- Engage with and understand our customers, forming relationships that allow us to understand and prioritize both innovative new offerings and incremental platform improvements
- Maintain detailed documentation of your work and changes to support data quality and data governance requirements
- Provide mentorship and guidance for team members; evangelize the platform, best practices, and data-driven decisions; identify new use cases and features and drive adoption
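To give a flavor of the shared-library responsibility above (abstracting naming and partitioning conventions in Scala or Python), here is a minimal Python sketch. The `DatasetSpec` class, its conventions, and all names in it are illustrative assumptions for this posting, not the team's actual code:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class DatasetSpec:
    """Hypothetical spec that a shared library might use to standardize
    table names and Hive-style partition paths across pipelines."""
    domain: str                    # e.g. "playback" (made-up domain)
    name: str                      # e.g. "session_events"
    partition_keys: tuple          # e.g. ("event_date",)

    def table_name(self) -> str:
        # Assumed convention: <domain>__<name>
        return f"{self.domain}__{self.name}"

    def partition_path(self, **values: str) -> str:
        # Build a Hive-style path such as .../event_date=2026-05-05
        missing = set(self.partition_keys) - values.keys()
        if missing:
            raise ValueError(f"missing partition values: {sorted(missing)}")
        parts = [f"{k}={values[k]}" for k in self.partition_keys]
        return "/".join([self.table_name(), *parts])

spec = DatasetSpec("playback", "session_events", ("event_date",))
print(spec.partition_path(event_date="2026-05-05"))
# playback__session_events/event_date=2026-05-05
```

Centralizing conventions like this is what keeps dozens of pipelines consistent without each team re-implementing the same path logic.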
**Tech Stack**
- Airflow
- Spark
- Databricks
- Delta Lake
- Snowflake
- Scala
- Python
**Basic Qualifications**
- 7+ years of software engineering experience developing backend applications
- 2+ years of data engineering experience developing large data pipelines
- Strong algorithmic problem-solving expertise
- Strong fundamental Scala and Python programming skills
- Basic understanding of AWS or other cloud provider resources (e.g., S3)
- Strong SQL skills and ability to create queries to analyze complex datasets
- Hands-on production environment experience with distributed processing systems such as Spark
- Hands-on production experience with orchestration systems such as Airflow
- Some scripting language experience
- Willingness and ability to learn and pick up new skillsets
- Self-starting problem solver with an eye for detail and excellent analytical and communication skills
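The SQL qualification above typically means window-function-style analysis over large datasets. As a small, hedged sketch of that flavor, the example below uses Python's standard-library sqlite3 purely as a stand-in for a warehouse engine like Snowflake; the table and data are entirely made up:

```python
import sqlite3

# Toy data: per-title daily view counts (invented for illustration).
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE views (title TEXT, day TEXT, n INTEGER);
INSERT INTO views VALUES
  ('A', '2026-05-01', 100), ('A', '2026-05-02', 150),
  ('B', '2026-05-01', 300), ('B', '2026-05-02', 250);
""")

# Running total of views per title, via a window function.
rows = conn.execute("""
SELECT title, day, n,
       SUM(n) OVER (PARTITION BY title ORDER BY day) AS running_total
FROM views
ORDER BY title, day
""").fetchall()

for row in rows:
    print(row)
# first row: ('A', '2026-05-01', 100, 100)
```

The same `SUM(...) OVER (PARTITION BY ... ORDER BY ...)` pattern carries over to Snowflake or Spark SQL with essentially identical syntax.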
**Preferred Qualifications**
- Experience with at least one major Massively Parallel Processing (MPP) or cloud database technology (Snowflake, Redshift, BigQuery)
- Experience in developing APIs with GraphQL
- Deep understanding of AWS or other cloud providers, as well as infrastructure as code
- Familiarity with Data Modeling techniques and Data Warehousing standard methodologies and practices
- Familiarity with Scrum and Agile methodologies
- Master's Degree a plus
**Required Education**
- Bachelor's Degree in Computer Science, Information Systems, or a related field, or equivalent industry experience
**Compensation and Benefits**
The hiring range for this position in Seattle is $156,300 - $209,600 per year and in Santa Monica is $149,300 - $200,200 per year. The base pay actually offered will take into account internal equity and may also vary depending on the candidate's geographic region, job-related knowledge, skills, and experience, among other factors. A bonus and/or long-term incentive units may be provided as part of the compensation package, in addition to the full range of medical, financial, and/or other benefits, dependent on the level and position offered.
**Why Join arenaflex?**
- Be part of a global leader in entertainment, media, and technology
- Work on cutting-edge projects that shape the future of Disney's media business
- Collaborate with talented engineers and data scientists to drive innovation and creativity
- Enjoy a dynamic and inclusive work environment that values diversity and inclusion
- Take advantage of comprehensive benefits and perks, including medical, financial, and other benefits
- Pursue career growth opportunities and learning benefits to enhance your skills and expertise
**How to Apply**
If you are a motivated and experienced software engineer with a passion for big data applications, we encourage you to apply for this exciting opportunity. Please submit your resume and a cover letter outlining your experience and qualifications for this role. We look forward to hearing from you!