2025-03-06 / Publication / ICLR DeLTA / AtropDiff: Data-Scarce Atropisomer Generation via Multi-Task Pretrained Classifier-Guided Diffusion
2025-04-03 / Publication / Journal of Cheminformatics / Clc-db: an open-source online database of chiral ligands and catalysts. An open-source online database of chiral ligands and catalysts for asymmetric synthesis.
2025-02-01 / Publication / Artificial Intelligence Chemistry / Chiralcat: Molecular chirality classification with enhanced spatial representation using learnable queries
2025-01-13 / Publication / Journal of Chemical Information and Modeling / Machine Learning for Reaction Performance Prediction in Allylic Substitution Enhanced by Automatic Extraction of a Substrate-Aware Descriptor. Automatic extraction of a substrate-aware descriptor for reaction performance prediction in allylic substitution.
2023-02-20 / Publication / SCIENCE CHINA Chemistry / Synergistic Catalysis for Stereocontrol of Prochiral Nucleophiles in Palladium-Catalyzed Asymmetric Allylic Substitution. A comprehensive review of the development of stereocontrol strategies for prochiral nucleophiles.
2022-08-02 / Publication / Angewandte Chemie / Bimetallic catalysis in stereodivergent synthesis. A comprehensive review of the application of bimetallic catalysis in stereodivergent synthesis.
2026-02-28 / Publication / arXiv / Rooted Absorbed Prefix Trajectory Balance with Submodular Replay for GFlowNet Training. A new training objective and diversity-promoting replay strategy for GFlowNets to address mode collapse in molecular generation.
2025-12-01 / Publication / arXiv / RxnBench: A Multimodal Benchmark for Evaluating Large Language Models on Chemical Reaction Understanding from Scientific Literature. A multimodal benchmark to evaluate LLMs on chemical reaction understanding from scientific literature.
2023-07-12 / Publication / bioRxiv / Uni-RNA: Universal Pre-trained Models Revolutionize RNA Research. A large-scale pre-trained language model achieves state-of-the-art performance on downstream RNA tasks.