BMC is looking for an Integration Engineer to join our growing team of innovation experts. In this role, you’ll be responsible for developing integration solutions that shape and define the enterprise integration strategy for BMC products and solutions.
So, if you love working with customers to help implement and support their integrations, if you’re excited about problem-solving and handling urgent support cases, and if you love working in a dynamic environment across multiple interfaces and locations – BMC is the place for you!
Primary Roles and Responsibilities:
Work with internal teams and customers to understand the needs of the market and to deliver integration solutions to solve customer problems
Create integrations for BMC’s market-leading Control-M product into enterprise environments, with an emphasis on integrations to data operations applications, data-related cloud services, and other applications and cloud services.
Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and public cloud big data and data warehouse technologies.
Develop connectors and development tools that work with REST APIs, flat files, and other protocols such as EDI, and data formats such as JSON, XML, CSV, and other file formats, as well as the mapping between them
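As a minimal illustration of the kind of format mapping this role involves, the sketch below converts a JSON array of records into CSV with a field rename along the way. The field names (`jobId`, `state`) and target schema are hypothetical, not taken from any BMC API:

```python
import csv
import io
import json

def json_records_to_csv(json_text: str) -> str:
    """Map a JSON array of records to CSV, a common integration task.

    The source field names below are illustrative only.
    """
    records = json.loads(json_text)
    out = io.StringIO()
    writer = csv.DictWriter(out, fieldnames=["id", "status"])
    writer.writeheader()
    for rec in records:
        # Map source fields onto the target schema, renaming as needed.
        writer.writerow({"id": rec["jobId"], "status": rec["state"]})
    return out.getvalue()

payload = '[{"jobId": 1, "state": "OK"}, {"jobId": 2, "state": "FAILED"}]'
print(json_records_to_csv(payload))
```

Real connector work adds schema validation, error handling, and pagination on top of mappings like this.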
Provide on-call support as part of an escalation path (level 1 is handled by the helpdesk)
Qualifications and Experience:
4+ years’ experience in a related role – Technical Professional Services Engineer, Technical Presales Engineer, Integration Engineer, Implementation Engineer, etc.
Experience with software as a service (SaaS) product development and delivery
Experience working with relational and NoSQL databases and authoring queries, with familiarity across a variety of databases (SQL Server, Postgres, Oracle, Cassandra, etc.)
Experience building and optimizing “big data” pipelines, architectures, and data sets using various technologies (Hadoop, HDFS, Spark, Kafka, EMR, Snowflake, Redshift, etc.)
Experience with data pipeline and workflow management tools (Airflow, Prefect, Dagster, etc.) a plus
Experience in enterprise application integration, including SOA, BPM, APIs, Web Services (SOAP/RESTful), microservices, OpenShift containers, Kubernetes, etc.
Experience with using JSON, XML, XSD, and other data payload formats
Results-oriented problem solver
Experience with Linux and middleware technologies (MuleSoft, Kafka, AWS Lambda)
Experience with AWS or GCP Cloud
Experience with a DevOps stack (CI/CD) and other dependency management and build tools
Experience with stream-processing systems: Storm, Spark Streaming, etc.