
Profile
- Biomedical data scientist with 10+ years of experience in data harmonization, bioinformatics, and research infrastructure.
- Experienced in leading system migrations and designing large-scale ETL pipelines for clinical, equipment, biospecimen, omics, and imaging data.
- Skilled in applying machine learning to biomedical datasets to uncover insights into disease progression and treatment outcomes.
- Focused on making complex data reliable, accessible, and useful for impactful biomedical research.
Services
- Data Harmonization & ETL – Build pipelines that clean, map, and transform complex biomedical datasets (clinical, biospecimen, omics, imaging) into standards such as OMOP, CDISC, and HL7 FHIR, following FAIR principles to enable interoperability and large-scale research.
- Data Systems – Design and implement digital infrastructure for research, including specimen management systems, LIMS, HIPAA-compliant data environments, and electronic workflows that replace manual processes.
- Machine Learning & Analytics – Develop predictive models and algorithms for clinical records, neuroimaging, and omics data, supporting research into disease progression and treatment response.
- Documentation & Training – Produce clear documentation, develop data management plans aligned with funder requirements, and deliver training that makes complex data pipelines understandable and usable by both technical teams and biomedical researchers.
Contact
If you’d like to discuss a project or collaboration, use the form below.