A Flutter-Firebase mobile application developed as part of our thesis:
"Badbad: Ata Manobo to English Neural Machine Translation-based Mobile Application Using Transformer Model".
- Flutter SDK (version 3.0.0 or later)
- Firebase account (for backend services)
- Android Studio/Xcode (for emulator/device testing)
- Clone the repository: `git clone https://github.com/your-username/BadBad.git`, then `cd BadBad`
- Install dependencies: `flutter pub get`
- Run the app: `flutter run`
This study develops Badbad, a neural machine translation (NMT) mobile application that translates Ata Manobo, a low-resource indigenous language of the Philippines, into English using a Transformer model architecture. The research addresses the critical gap in digital language tools for Ata Manobo speakers, who face socioeconomic and educational barriers due to limited English proficiency.

The methodology involves collecting and preprocessing a parallel corpus of 16,562 sentence pairs drawn from religious texts, literature, and community contributions, augmented with back-translation to mitigate data scarcity. The model employs a 4-layer Transformer encoder-decoder optimized for low-resource settings and achieves a BLEU score of 12.7 after architectural tuning and data augmentation. Key innovations include SentencePiece tokenization (SPT) to reduce out-of-vocabulary errors and the integration of Optical Character Recognition (OCR) for image-based translation.

Manual evaluation by bilingual speakers rates the model's adequacy at 2.66/5 and its fluency at 2/5. The study demonstrates the feasibility of NMT for severely low-resource languages and provides a scalable framework for preserving linguistic heritage through technology. Recommendations emphasize expanding the dataset through community collaboration, refining model efficiency, and leveraging policy support for indigenous language preservation.
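For readers unfamiliar with the BLEU score cited above: it measures how much n-gram overlap a hypothesis translation has with a reference, discounted by a brevity penalty. Below is a minimal, hedged sketch of a sentence-level BLEU with simple add-one smoothing — this is an illustration of the metric, not the evaluation script used in the thesis (which would more likely use a standard toolkit such as sacreBLEU); the function name and signature are our own.

```python
import math
from collections import Counter

def bleu(reference: list[str], hypothesis: list[str], max_n: int = 4) -> float:
    """Illustrative sentence-level BLEU (0-100) with add-one smoothing.

    Assumes both inputs are pre-tokenized lists of strings; an empty
    hypothesis scores 0 by convention.
    """
    if not hypothesis:
        return 0.0
    precisions = []
    for n in range(1, max_n + 1):
        # Count n-grams in reference and hypothesis
        ref_ngrams = Counter(tuple(reference[i:i + n]) for i in range(len(reference) - n + 1))
        hyp_ngrams = Counter(tuple(hypothesis[i:i + n]) for i in range(len(hypothesis) - n + 1))
        # Clipped overlap: each hypothesis n-gram counts at most as often as in the reference
        overlap = sum((hyp_ngrams & ref_ngrams).values())
        total = max(sum(hyp_ngrams.values()), 1)
        # Add-one smoothing so one missing n-gram order does not zero the whole score
        precisions.append((overlap + 1) / (total + 1))
    # Brevity penalty punishes hypotheses shorter than the reference
    bp = 1.0 if len(hypothesis) >= len(reference) else math.exp(1 - len(reference) / len(hypothesis))
    return 100.0 * bp * math.exp(sum(math.log(p) for p in precisions) / max_n)
```

A perfect match scores 100, while partial overlap lands somewhere in between, which is why a score of 12.7 on a severely low-resource pair still signals usable (if imperfect) translations rather than noise.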