BMC Innovation Labs brings together customers, partners, and employees to accelerate the development of new, relevant solutions that create value. Our goal is to harness new ideas and to anticipate and act on market changes by:
• Fostering innovation by creating spaces for experimentation
• Advancing ideas that generate disruptive technologies
• Accelerating prototyping and development of new capabilities
Within this team you will work at the leading edge of technology to help create truly innovative solutions, ranging from IoT-based projects to Augmented Reality, AI/ML, and Edge Computing.
BMC is looking for a Software Engineer (Data/Integrations) to join our growing team of innovation experts. You will be responsible for developing integration solutions that shape and define the enterprise integration strategy for BMC products and solutions.
Primary Roles and Responsibilities:
You will understand the challenges enterprise IT organizations face in integrating workflow automation solutions into their environments to deliver business outcomes and IT services to end users.
You will work with internal teams and customers to understand the needs of the market and to deliver integration solutions to solve customer problems
You will help create data integration pipelines for batch, micro-batch, and real-time data streams. You will also identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, and re-designing infrastructure for greater scalability.
You will build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and public cloud big data and data warehouse technologies.
You will develop scalable integration platforms, connectors, and development tools that work with REST APIs, flat files, and protocols such as EDI; data formats such as JSON, XML, and CSV; and the mappings between them.
You will design and build robust data integrations within our data platform and between engineering systems to enable touchless data processing.
Qualifications and Experience:
Participation in the full product development lifecycle of software products (whiteboard to production), having successfully brought deliverables to market
Experience with software as a service (SaaS) product development and delivery
Experience working with a modern programming language such as Java / Golang / Node.js / Python
Solid experience in enterprise application integration, including SOA, BPM, APIs, Web Services (SOAP/RESTful), microservices, and containerization (OpenShift, Kubernetes, etc.)
Experience with using JSON, XML, XSD, and other data payload formats
Advanced working familiarity with a variety of databases (SQL Server, PostgreSQL, Oracle, Cassandra, etc.)
Experience building and optimizing "big data" pipelines, architectures, and data sets using technologies such as Hadoop, HDFS, Spark, Kafka, EMR, Snowflake, and Redshift
Experience with stream-processing systems: Storm, Spark-Streaming, etc.
Experience with Linux and middleware technologies (MuleSoft, Kafka, Lambda)
Experience with AWS or GCP Cloud
Experience with the DevOps stack (CI/CD) and with dependency management and build tools