Job Attributes

Work Schedule/Shift: Not Available
Job ID: 5000379070406
Req. No: 70027576
Job Title: Big Data Analytics Architect
Region: CHRISTUS System Office
Category: Information Technology
Division: Not Defined
Company: CHRISTUS Health
Travel: 1-10%
Facility: CHRISTUS Information Mgmt-79100
Address: 919 Hidden Ridge, Irving, TX 75038, US
Type: Full Time

POSITION SUMMARY

The Big Data Analytics Architect is responsible for the technical architecture of data platforms, solutions, and applications that integrate, store, process, and analyze a wide variety of data sets from across the Enterprise. The Architect is also responsible for the design, development, testing, and deployment of the proposed solution. This includes an understanding of methodology, design, specifications, programming, delivery, monitoring, and support standards.

The individual must have extensive knowledge of designing and developing data pipelines and delivering advanced analytics with open-source Big Data processing frameworks such as Hadoop version 2 technologies. Must have hands-on experience with Hadoop application administration, configuration management, monitoring, debugging, and performance tuning. Must have proven competency in programming using distributed computing principles.

The Big Data Analytics Architect is also responsible for supporting the business goals and objectives of the Data Management & Analytics Department, the Information Management Organization, and the organization as a whole.

MAJOR RESPONSIBILITIES
  • Design & Development -- Provides technical development expertise for designing, coding, testing, debugging, documenting, and supporting all types of applications, consistent with established specifications and business requirements, in order to deliver business value.
  • Strategy Execution -- Contributes to the execution of CHRISTUS' overall information systems strategy as it pertains to the vision of the organization in both strategic and tactical plans. Involved in team adoption, execution, and integration of strategy to achieve optimal and efficient delivery.
  • System Engineering -- Involved in the evaluation of proposed system acquisitions or solutions development and provides input to the decision-making process relative to compatibility, cost, resource requirements, operations, and maintenance.
  • System Integration -- Integrates software components, subsystems, facilities, and services into the existing technical systems environment; assesses the impact on other systems and works with cross-functional teams within information management to ensure positive project impact. Installs, configures, and verifies the operation of software components.
  • System Management -- Participates in the development of standards and in the design and implementation of proactive processes to collect and report data and statistics on assigned systems.
  • System Security -- Participates in the research, design, development, and implementation of application, database, and interface security using technologies such as SSL, Public-Key encryption, and Certificates or other emerging security technologies.
POSITION SPECIFIC COMPETENCIES
  • Proficiency in best practices for building Data Lake and analytical platform architectures on Hadoop.
  • Expertise in schema design, developing data models and proven ability to work with complex data is required.
  • Proficiency with Hadoop v2, MapReduce, Spark, HDFS, Python or R.
  • Experience with Hadoop cluster management, with all included services.
  • Experience with data integration using ETL techniques and frameworks, such as Flume.
  • Proficiency with Big Data querying tools, such as Pig, Hive, and Impala.
  • Experience with messaging systems, such as Kafka or MQ.
  • Experience with Big Data Machine Learning toolkits, such as Mahout or SparkML.
  • Knowledge of NoSQL databases, such as HBase, Cassandra, MongoDB.
  • Experience with building stream-processing systems, using solutions such as Storm or Spark-Streaming is preferred.
  • Strong scripting/programming background (Unix, Python preferred).
  • Strong SQL experience with the ability to develop, tune and debug complex SQL applications.
  • Experience in creating capability roadmaps for data platforms.
  • Experience in review of schemas, data models and data architecture for Hadoop and Teradata environments.
  • Good understanding of Lambda Architecture.
  • Solid understanding of the BI and analytics landscape, preferably in large-scale development environments.
  • Must have the communication skills and ability to develop and present solutions to all levels of management (including executive levels).
  • Must have demonstrated the ability to solve complex problems with minimal direction.
  • Must be able to interact effectively and patiently with customers, especially while under pressure.
  • Ability to work on multiple projects/tasks simultaneously to meet project deadlines for self and others as required.
  • Ability to establish and maintain positive working relationships with other employees.
POSITION QUALIFICATIONS
A. Education/Skills
  • Bachelor's degree in Computer Science, Engineering, Math, or a related field, OR 15 years of equivalent experience in the IT field, is required.
B. Experience
  • Minimum of ten (10) years of experience with the design, architecture, and development of Enterprise-scale data platforms.
  • Minimum of six (6) years of experience as an Enterprise Solutions Architect.
  • Minimum of six (6) years of experience developing analytics solutions with large data sets within OLAP and MPP architectures.
C. Licenses, Registrations, or Certifications
  • Certifications in Hadoop, AWS, Azure, or Java are a plus.