This role will support the implementation of a unified “Enterprise Data Hub” in a healthcare payer setting. The project entails the implementation of a Hadoop-based big data platform and a Teradata (MPP) appliance supporting a unified data layer that houses raw and refined data from multiple domains, enabling: lean, faster delivery of data; enhanced data management and engineering; advanced analytical capabilities (for power users and data scientists); maintenance of metadata repositories and data lineage; and data mastering.
The intent of this corporate initiative is to build a Big Data foundation that can be leveraged to build services on an ongoing basis supporting multiple business areas across the organization. The first business area supported by the data hub will be Provider; the ideal candidate should therefore have experience in the Provider domain.
The Data Architect is a high-exposure role within the company’s Analytics & Data division, responsible for the architecture and modeling of all data solutions, identifying gaps in the current data architecture, and defining the future-state data models and capabilities necessary to enable Big Data adoption and self-service business intelligence. The successful candidate should have experience in the healthcare payer industry, experience navigating and negotiating data solutions in a complex, matrixed organization, excellent communication and consulting skills, and the ability both to operate at a high level, defining data strategy in line with the latest industry trends (MDM, Big Data, etc.), and to get deep into the details of designing data models from scratch. This position offers opportunity for rapid growth, so preference is for a contractor looking to convert to an employee.
1. 6+ years of experience within health care analyzing health plan data (Membership, Claims, Provider, Clinical, etc.), and knowledge of the associated business processes and supporting metrics/KPIs
2. 6+ years of experience architecting and modeling enterprise-level solutions supporting Big Data platform implementations, incorporating Master Data Management standards, and aligning data models with the guidelines and standards stipulated by the Enterprise Data Governance committee
3. 6+ years of experience interfacing between technical and non-technical areas of an enterprise organization, specifically between operational and analytical areas of the business, enterprise architecture, and technical resources developing information management solutions
Extensive knowledge of data modeling concepts and best practices encompassing NoSQL (key-value/tabular, JSON/document, and graph databases), multi-layer data warehouses (normalized/Inmon) and data marts (dimensional/Kimball), and expert judgment on when to apply each approach
Experience building enterprise-level data models for Hadoop in HBase to support Hive queries
Experience modeling an analytic data warehouse leveraging an MPP in-memory technology such as Teradata, with consumption through a BI tool (Cognos preferred)
Experience with data modeling and metadata solutions, including ERwin, System Architect, and Informatica Enterprise Data Catalog, and leveraging those solutions to produce data dictionaries and enterprise data lineage documentation
Experience with data governance, working with data stewards and business stakeholders to understand the impacts of data model changes, gain consensus on data definitions and business rules, and communicate changes to all consuming systems
Demonstrated ability to lead and mentor data modelers and data analysts
Knowledge of Healthcare transaction standards (834, NCPDP, HL7, etc.)
Knowledge of Industry standard data models including IBM Unified Data Model for Healthcare (UDMH) and/or Teradata Healthcare Data Model, and experience leveraging industry standards and models to establish an enterprise conceptual data model and logical data models
Experience with Master Data Management solutions, Informatica MDM preferred
Experience with data profiling/quality solutions, Informatica Data Quality (IDQ) preferred
Ability to work on multiple projects simultaneously and deliver within tight timelines while being flexible in adapting to new roles
Ability to work with multiple areas within the organization to gather business objectives, data requirements, etc.
Ability to interpret data that is not well defined or documented and develop recommendations based on findings
Experience transitioning an organization from a traditional EDW to a modern data platform, including mitigating the impact on data consumers
Experience designing an approach for change data capture (CDC) from a complex enterprise system such as FACETS
Knowledge of data movement techniques for big data, including batch, CDC, and event streams, leveraging technologies including Informatica BDE, Sqoop, Kafka, Storm, SAS
Ability to read and analyze existing code and propose innovative solutions and modifications
Bachelor’s Degree in a related field required; Master’s Degree in a related field preferred
Salary: 0 to 0
Years of Experience: 5+ to 10 years
Minimum Education: -
Willingness to Travel: -
Hours per week: 0