**Experienced Full Stack Data Engineer – Web & Cloud Application Development at arenaflex**
Posted 2026-05-05

*Join arenaflex, a leading global retailer, and be part of a dynamic team that is revolutionizing the way we shop. As a Full Stack Data Engineer, you will play a critical role in designing and operationalizing data pipelines that drive business insights and inform strategic decisions.*
**About arenaflex**
arenaflex is a multinational retailer with a presence in 14 countries and more than 300,000 employees worldwide. We are committed to providing a family-friendly work environment where our employees can thrive and grow; as a testament to that commitment, arenaflex was ranked 7th on Forbes' "World's Best Bosses" list. Our company culture is built on respect, integrity, and teamwork, and we are looking for talented individuals who share these values to join our team.
**Job Summary**
The Full Stack Data Engineer will be responsible for designing and operationalizing data pipelines to make data accessible for business use (BI, Advanced Analytics, Services). This includes data ingestion, data transformation, data quality, data pipeline development, integration, and collaboration with DevOps Engineers during CI/CD. The ideal candidate will have a strong foundation in programming and SQL, with expertise in data storage, visualization, cloud, data warehousing, and data lakes.
**Key Responsibilities**
- Design and operationalize data pipelines that power analytics and data services
- Collaborate with product owners, engineering, and data science teams to design, build, test, and automate data pipelines that are relied upon across the organization as the single source of truth
- Work closely with data architects to align on data engineering requirements
- Identify, design, and implement internal process improvements: automating manual processes and streamlining data delivery
- Develop and maintain optimal data pipeline design
- Develop and execute ETL/ELT processes using Informatica Cloud (IICS)
- Utilize Azure services such as Azure SQL DW (Synapse), ADLS, Azure Event Hubs, and Azure Data Factory to improve and accelerate delivery of our data products and services
- Communicate technical ideas to non-technical audiences in both written and verbal form
- Gather large, complex datasets to meet business needs
- Build data models in collaboration with Data Modelers and develop data pipelines that store data according to the defined data models and designs
- Identify ways to improve data reliability, efficiency, and quality of data management
- Lead ad-hoc data retrieval for business reports and dashboards
- Verify the integrity of data from various sources
- Manage data warehouse administration, including installing and updating software and maintaining relevant documentation
- Monitor data warehouse workloads and resource utilization
- Perform peer reviews of other Data Engineers' work
- Work closely with data and BI architects to design data pipelines and drive continuous improvement of data storage, data ingestion, data quality, and data organization
**Requirements**
- 3+ years of experience designing and operationalizing data pipelines with large and complex datasets
- 3+ years of hands-on experience with Informatica PowerCenter
- 3+ years of experience in Data Modeling, ETL, and Data Warehousing
- 3+ years of hands-on experience with Informatica IICS
- 3+ years of experience working with cloud technologies such as ADLS, Azure Data Factory, Databricks Delta Live Tables, Azure Synapse, Cosmos DB, and other big data technologies
- Broad experience working with various data sources (SQL Server, Oracle, flat files (CSV, delimited), Web APIs, XML)
- Advanced SQL skills; strong understanding of relational databases and business intelligence, with the ability to write complex SQL queries against various data sources
- Strong understanding of database management concepts (data lakes, relational databases, NoSQL, graph databases, data warehousing)
- Ability to work in a fast-paced agile development environment
- Flexibility to address business needs including weekends, holidays, and on-call responsibilities on a rotational basis
- Bachelor's degree in Computer Science, Engineering, or equivalent programming/services experience
- Azure certifications
- Experience executing data integration strategies, such as event/message-based integration (Kafka, Azure Event Hubs) and ETL
- Experience with Git/Azure DevOps
- Experience delivering data solutions through agile software development processes
- Familiarity with the retail industry
- Excellent verbal and written communication skills
- Experience working with SAP integration tools, including BODS (SAP BusinessObjects Data Services)
- Experience with UC4 Job Scheduler
**Preferred Qualifications**
- Master's degree in Computer Science, Engineering, or equivalent programming/services experience
- Experience with data visualization tools such as Tableau, Power BI, or D3.js
- Experience with machine learning algorithms and techniques
- Experience with cloud-based data platforms such as Amazon Redshift, Google BigQuery, or Snowflake
- Experience with data governance and data quality frameworks
- Experience with data security and compliance frameworks
**What We Offer**
- Competitive salary and benefits package
- Opportunity to work with a leading global retailer
- Collaborative and dynamic work environment
- Professional development and growth opportunities
- Recognition and rewards for outstanding performance
- Flexible work arrangements, including remote work options
- Access to cutting-edge technology and tools
- Opportunities to work on high-impact projects and initiatives
**How to Apply**
If you are a motivated and talented individual who is passionate about data engineering and wants to be part of a dynamic team, please submit your application through our website. We look forward to hearing from you!