Radiology is the second most utilized healthcare service, accessed by over 51% of the workforce annually. Despite its critical role in healthcare, the process for undergoing a medical imaging exam has remained unchanged for decades. OneImaging is solving this with a concierge approach and a premium-quality radiology network of over 4,000 vetted providers across 48 states, while reducing imaging costs by 60-80%. Our solution helps patients and families access essential radiology services at fair prices and without surprise bills, all while delivering immediate savings and ROI for employers and payers on every exam.
What you'll do:
Design, build, and maintain end-to-end data pipelines using Databricks (Spark SQL, PySpark) for data ingestion, transformation, and processing.
Integrate data from various structured and unstructured sources, including medical imaging systems, EMRs, change data capture (CDC) feeds from SQL databases, and external APIs.
Collaborate with the analytics team to create, optimize, and maintain dashboards in Looker.
Implement best practices in data modeling and visualization for operational efficiency.
Deploy and manage cloud-based solutions on AWS (e.g., S3, EMR, Lambda, EC2), using IaC tooling (Terraform and Databricks Asset Bundles) to ensure scalability, availability, and cost-efficiency.
Develop and maintain CI/CD pipelines for data-related services and applications.
Oversee MongoDB and PostgreSQL databases, including schema design, indexing, and performance tuning.
Ensure data integrity, availability, and optimized querying for both transactional and analytical workloads.
Adhere to healthcare compliance requirements (e.g., HIPAA) and best practices for data privacy and security.
Implement error handling, logging, and monitoring frameworks to ensure reliability and transparency.
Implement data governance frameworks to maintain data integrity and confidentiality.
Work cross-functionally with data scientists, product managers, and other engineering teams to gather requirements and define data workflows.
Document data pipelines, system architecture, and processes for internal and external stakeholders.
About you:
3+ years of professional experience in data engineering or a similar role.
Bachelor’s or Master’s degree in Computer Science, Engineering, or a related field.
Proven expertise in building large-scale data pipelines with Databricks (Spark).
Experience in creating dashboards, data models, and self-service analytics solutions with Looker.
Proficiency with core AWS services such as S3, EMR, Lambda, IAM, and EC2.
Demonstrated ability to design schemas, optimize queries, and manage high-volume databases with MongoDB and PostgreSQL.
Strong SQL skills, plus familiarity with Python, Scala, or Java for data-related tasks.
Excellent communication and team collaboration abilities.
Strong problem-solving aptitude and analytical thinking.
Detail-oriented, with a focus on delivering reliable, high-quality solutions.
Experience in healthcare or medical imaging (e.g., DICOM, HL7/FHIR) preferred.
Familiarity with DevOps tools (Docker, Kubernetes, Terraform) and CI/CD pipelines.
Knowledge of machine learning workflows and MLOps practices.
Please apply through the job application form on the OneImaging website or via the Greenhouse job board link provided.