Magellan Health

Big Data Architect

  • Contract: Full-time

Job Description

This position is the architecture leader for the Supply Chain of Data program and a visible leader both internally and externally to Magellan. The ideal candidate is an expert in architecture methods and technologies across multiple data platforms, with a demonstrated ability to shape and drive strategic vision. He or she excels at inspiring, motivating, and creating highly productive teams, and at attracting, hiring, and developing a strong bench of talent. The role requires a unique combination of highly evolved technical skills and the ability to envision a strategy and apply creativity to realize ambitious goals on behalf of the business, inspiring the organization to create world-class solutions. Candidates must be passionate about supporting the business and company vision, and able to communicate a clear point of view from a business perspective in strategic discussions with executive leadership.

Essential Functions:

  • Influence changes and system enhancements to business processes, policies, and infrastructure to deliver the most effective IT services.
  • Manage the selection, evaluation, contracting and integration of externally available hardware, software, and processes to deliver business solutions.
  • Direct and manage, through analysis, the planning, design, development, testing, installation and maintenance of support systems for both internal/external clients.
  • Select and build strong teams through formal training, diverse assignments, coaching, mentoring, and other team development techniques, along with regular individual and group meetings.
  • Develop tactical and strategic plans to satisfy technical business needs and new business proposals.
  • Manage and develop project cost estimates, benefits, justification and assessment of potential project risks.
  • Manage projects, staff, customer expectations and business priorities to achieve customer and business unit satisfaction.
  • Manage, plan and track all budgets and expenses.
  • Oversee vendor relationships and projects.
  • Direct and manage business process re-engineering associated with existing systems and with the development and implementation of new systems.
  • Develop and maintain strong, cooperative relationships with other IT services groups; share experiences and establish best practices and opportunities throughout the organization.
  • Apply Agile development practices in delivery of transformational programs as required.

Position Responsibilities

  • Design and implement scalable Big Data architecture solutions
  • Architect, design, and implement high-performance, large-volume data integration processes, data provisioning, and advanced analytical capabilities.
  • Provide guidance and platform selection advice for common Big Data (Distributed) platforms
  • Design data flow and processing pipelines for ingestion and analysis using modern toolsets such as Spark (Scala), Kafka, Flume, Sqoop, and others.
  • Partner with Data Governance Leader and other Data Architects on designs supporting Enterprise Data Governance.
  • Develop and recommend innovative yet proven approaches to solving business and technical problems using advanced analytics solutions.
  • Design data provisioning structures for ingestion and reporting, specific to use case and technology.
  • Provide data management expertise in evaluating requirements and developing data architecture and refining platform components and design
  • Mentor and guide junior data engineers, DevOps engineers, share best practices, and perform code reviews

Skill Requirements

  • BA/BS degree or equivalent experience; Computer Science or Math background preferred
  • Over 15 years of engineering and/or software development experience, with demonstrable architecture experience in a large organization
  • Hands-on experience with Big Data components/frameworks such as Hadoop, Spark, Storm, HBase, HDFS, Pig, Hive, Scala, Kafka, Python scripts, and Unix shell scripts
  • Experience architecting and implementing large, highly complex big data projects
  • Experience with Hadoop and related technologies (Cloudera, Hortonworks, etc.)
  • Experience with data integration and streaming tools used with Hadoop (Spark, Kafka, etc.)
  • Experience in metadata management, data lineage, and data governance, especially as related to Big Data
  • Experience with cloud platforms (AWS/Azure), including readiness, provisioning, security, and governance
  • Experience with or understanding of Data Science and related technologies (Python, R, SAS, etc.)
  • Experience with or understanding of Artificial Intelligence (AI), Machine Learning (ML), and Applied Statistics
  • History of working successfully with cross-functional engineering teams

About the Company

Magellan is a company on the move, leading humanity to healthy, vibrant lives.

How to apply

Submit your application using the following link.

External Link »