Careers At Laurus

Current job opportunities are posted here as they become available.

Subscribe to our RSS feeds to receive instant updates as new positions become available.
Data Engineer (Brighton, MI Office)

CommonSail Investment Group

Location: Brighton, Michigan

Overview: As a Data Engineer, you will design, build, and maintain the data infrastructure within CommonSail Investment Group. You will work at the intersection of senior housing data, healthcare operations data, and modern data engineering: building robust pipelines, maintaining our Snowflake cloud data lake, developing APIs, and partnering with IT and data professionals to ensure the right data reaches the right people at the right time. This role is pivotal in ensuring that CommonSail’s data assets are accurate, secure, accessible, and leveraged effectively to support strategic business initiatives and operational excellence.

Qualifications:

  • Bachelor’s degree in Information Systems, Computer Science, Data Science, or a related field.
  • 3–7 years of experience in data engineering, data infrastructure, API development, data integration, and data governance.
  • Strong command of SQL and Python for data transformation, automation, and pipeline development.
  • Demonstrated hands-on experience with Snowflake, including performance tuning, clustering, and access control.
  • Proficiency with dbt (data build tool) for transformation layer development, testing, and documentation.
  • Experience building and consuming RESTful APIs, with working knowledge of authentication patterns (OAuth 2.0, API keys).
  • Knowledge of data privacy, compliance, and security standards (e.g., HIPAA, GDPR).
  • Strong communication, problem-solving, and analytical skills.
  • Experience in healthcare, long-term care, or related regulated industries — familiarity with PointClickCare, MatrixCare, or similar EHR/EHR-adjacent systems is a strong plus.

Primary Responsibilities:

  • Data Pipelines: Design, build, and maintain scalable data pipelines for ingestion, transformation, and delivery of healthcare operations data from multiple client systems and third-party sources.
  • Data Architecture & Administration: Architect and administer the organization's Snowflake data warehouse, including database design, role-based access control, query optimization, and cost governance.
  • dbt: Develop and maintain dbt models for data transformation, testing, documentation, and lineage across all data domains (census, financials, clinical, acquisition targets).
  • API Development: Build and maintain RESTful APIs and integration services that connect source systems, internal tools, and analytical platforms.
  • Data Lake Administration: Design and implement data lake strategies for raw data storage, archival, and cost-efficient processing of high-volume datasets.
  • Data Modeling: Create and enforce data modeling standards — dimensional modeling, star/snowflake schemas, and normalized models — across the enterprise data warehouse.
  • Stakeholder Collaboration: Partner with business leaders and technical teams to align data strategy with organizational goals, translating requirements into scalable solutions.
  • Performance Monitoring: Monitor pipeline health, establish alerting and data quality checks, and resolve incidents with urgency and rigor.
  • Data Support: Support business operation engagements with data room analysis, source system evaluation, and integration planning for EHR entities.
  • Data Engineering Practices: Champion data governance, documentation, and best practices across the data engineering function.

Skills:

  • Cloud Data Warehousing (Snowflake): Design, administer, and optimize cloud-based data warehouses, including schema design, performance tuning, cost governance, security, and role-based access control.
  • Data Transformation & Modeling: Develop scalable transformation layers using dbt and SQL, implementing dimensional and normalized models to support analytics and downstream consumption.
  • Programming & Analytics: Use Python and SQL for data transformation, automation, pipeline logic, and analytical problem-solving across batch and event-driven workloads.
  • Data Pipelines & Orchestration (ETL / ELT, Airflow or Similar): Build, schedule, and monitor ETL/ELT pipelines with orchestration tools to ensure timely, accurate, and resilient data processing.
  • Data Ingestion & Integration (Fivetran, Airbyte, APIs): Ingest data from SaaS platforms, databases, and systems using managed connectors and custom integrations, ensuring scalability and data reliability.
  • API Development & Data Services (REST APIs, JSON): Design and consume RESTful APIs to exchange data between systems, using standard formats such as JSON and secure authentication patterns.
  • Cloud Platforms & Supporting Technologies (AWS, Azure, GCP): Leverage major cloud platforms and supporting services for storage, compute, networking, and security to build flexible and scalable data solutions.
  • Data Storage Formats & Lakes (Data Lakes, Parquet, Avro): Implement data lake architectures using efficient, columnar storage formats to support large-scale analytics, archival, and cost-effective querying.
  • Development Practices & Tooling (Git, VS Code, Database Administration): Apply modern development workflows using version control, IDEs, and database administration best practices to maintain high-quality, well-documented data systems.

General Working Conditions: While performing the duties of this job, the employee is required to communicate effectively with others, sit, stand, walk, and use their hands to handle the keyboard, telephone, paper, files, and other equipment and objects. The employee is occasionally required to reach with hands and arms. This position requires the ability to review detailed documents and read computer screens. The employee will occasionally lift and/or move up to 25 pounds. The work environment requires appropriate interaction with others. The noise level in the work environment is moderate. Occasional travel to different locations may be required.

Equal Opportunity Employer
