Upgrade your analysis

Addressing the Growing Tuberculosis Crisis
The WHO has identified the computational resources required to generate, analyse, store, and manage sequencing output data as one of the main challenges in combating TB.
EIT Pathogena’s Bioinformatics Analysis Pipeline for Mycobacterium tuberculosis and Non-Tuberculous Mycobacteria
Includes species identification, lineage calling, identification of related isolates within the Mycobacterium tuberculosis complex, and resistance prediction for 15 drugs.
Also includes species detection for 190 NTMs.
Fully automated, including insight reports and access to intermediate files.
Operates and stores data securely within the Oracle Cloud, with world-class data privacy and protection.
Supported NGS Technologies: Illumina and Oxford Nanopore Technologies (experimental)
Only EIT Pathogena brings it all together
We partnered with the University of Oxford, globally renowned for its expertise in pathogens, and Oracle, a world leader in cloud data management, to deliver a fully automated analysis platform and cloud data management solution for mycobacteria.
Complete Mycobacterium tuberculosis WGS analysis: Comprehensive whole-genome sequencing analysis for Mycobacterium tuberculosis.
Species identification for 190 NTMs: Identifies 190 non-tuberculous mycobacteria (NTM) species with precision.
>95% sensitivity and >97% specificity for RIF and INH: High sensitivity and specificity for Rifampicin (RIF) and Isoniazid (INH), ensuring accurate diagnostics.
Fast and insightful analysis: Accurate analysis of up to 100 Mycobacterium tuberculosis samples per hour.
Cloud-based automated bioinformatics pipeline: An automated cloud pipeline that enhances both the quality and speed of analysis.
How the pipeline works
Upload: Uploading gzipped FASTQ (.fastq.gz) files is simple using our drag-and-drop interface.
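If your reads are still uncompressed, a minimal Python sketch along these lines (file names are placeholders) can produce the gzipped FASTQ files the portal expects:

```python
import gzip
import shutil
from pathlib import Path

def gzip_fastq(fastq_path: str) -> Path:
    """Compress a plain-text FASTQ file to .fastq.gz, ready for upload."""
    src = Path(fastq_path)
    dest = Path(f"{src}.gz")
    with open(src, "rb") as f_in, gzip.open(dest, "wb") as f_out:
        shutil.copyfileobj(f_in, f_out)  # stream-copy so large files never sit fully in memory
    return dest

# Example: compress both reads of a paired-end Illumina sample (placeholder names)
for fq in ("sample1_R1.fastq", "sample1_R2.fastq"):
    print(f"Wrote {gzip_fastq(fq)}")
```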
Human read removal: The pipeline is built with data privacy and security at its core. To protect patient information, and to avoid encumbering downstream analyses with off-target sequences, the first step when using the portal is human read removal.
For added security, when using the command-line interface you also have the option to remove human reads before your data leaves your network.
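The portal and command-line interface handle this step for you. Purely as a conceptual illustration of client-side host depletion, and not the pipeline's actual implementation, the Python sketch below assumes minimap2 and samtools are available on the PATH and uses placeholder file names: read pairs are aligned to a human reference, and only pairs in which neither mate maps are kept.

```python
import shlex
import subprocess
from pathlib import Path

def remove_human_reads(r1: str, r2: str, human_ref: str, out_prefix: str):
    """Align a read pair to a human reference and keep only pairs in which
    neither mate maps (SAM flag 12). Requires minimap2 and samtools."""
    clean_1 = Path(f"{out_prefix}_1.clean.fastq.gz")
    clean_2 = Path(f"{out_prefix}_2.clean.fastq.gz")

    # Stream SAM records from minimap2 straight into samtools fastq.
    align = subprocess.Popen(
        shlex.split(f"minimap2 -ax sr {human_ref} {r1} {r2}"),
        stdout=subprocess.PIPE,
    )
    subprocess.run(
        shlex.split(f"samtools fastq -f 12 -1 {clean_1} -2 {clean_2} -"),
        stdin=align.stdout,
        check=True,
    )
    align.stdout.close()
    align.wait()
    return clean_1, clean_2

# Placeholder file names for a paired-end Illumina sample
remove_human_reads("sample1_R1.fastq.gz", "sample1_R2.fastq.gz",
                   "GRCh38.fasta", "sample1")
```

Filtering on SAM flag 12 (read unmapped and mate unmapped) discards any read pair with evidence of human origin before the data is uploaded.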