Camsw0rld Crash Detection 🚀

Motion Tracking & Safety Monitoring for iOS

📱 Download iOS App

📱 About Camsw0rld Crash Detection

Camsw0rld Crash Detection is a personal safety app built to detect potential vehicle collisions using your phone’s accelerometer and location services. Upon detecting a crash event, the app instantly notifies your emergency contacts via SMS and can deliver push notifications to your device — even when the app is running in the background. This app was designed with driver safety, simplicity, and fast response times in mind.

🛠️ How I Built It

📌 The Problem

Most consumer safety apps rely on cloud services and third-party integrations, and few are tailored to work offline or without server dependencies. I wanted to design an app that worked locally on-device, with the ability to escalate alerts via push notifications and text messages — especially in crash scenarios.

📱 Native iOS Development

  • Framework: SwiftUI for UI, CoreMotion for accelerometer data, CoreLocation for GPS, and UserNotifications for notifications.
  • Real-Time Motion Detection: The app uses the iPhone’s accelerometer to monitor x-, y-, and z-axis forces and classifies movement against sensitivity thresholds.
  • Permissions: A clean onboarding flow ensures Motion, Location, and Notification permissions before use.
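A minimal sketch of what that monitoring loop might look like with CoreMotion; the 50 Hz interval, the 5.0 m/s² trigger, and the `MotionMonitor`/`onPossibleCollision` names are illustrative assumptions, not the app's actual implementation:

```swift
import CoreMotion

/// Sketch of a real-time accelerometer monitoring loop (assumed names/threshold).
final class MotionMonitor {
    private let motionManager = CMMotionManager()
    var onPossibleCollision: ((Double) -> Void)?  // assumed callback

    func start() {
        guard motionManager.isDeviceMotionAvailable else { return }
        motionManager.deviceMotionUpdateInterval = 1.0 / 50.0  // 50 Hz sampling

        motionManager.startDeviceMotionUpdates(to: .main) { [weak self] motion, _ in
            // userAcceleration is gravity-free and reported in g.
            guard let a = motion?.userAcceleration else { return }
            let magnitude = (a.x * a.x + a.y * a.y + a.z * a.z).squareRoot() * 9.81
            if magnitude >= 5.0 {  // illustrative collision threshold (m/s²)
                self?.onPossibleCollision?(magnitude)
            }
        }
    }

    func stop() { motionManager.stopDeviceMotionUpdates() }
}
```

Using `userAcceleration` (rather than raw accelerometer data) matters here: it already has gravity removed, so the device's resting ~1 g does not count toward the threshold.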

🌐 Push Notifications (Two-Stage System)

  • Stage 1: Local push notifications when a crash is detected in the foreground.
  • Stage 2: Remote push notifications via APNs using a Vapor server, JWT authentication, and .p8 private key for secure delivery.
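For Stage 2, the server ultimately posts a JSON payload to APNs. A sketch of that payload as Swift `Codable` types; the struct names are illustrative assumptions, while the alert text matches the notification described later in this document:

```swift
import Foundation

// Sketch of the APNs payload shape the server might send (assumed type names).
struct APSAlert: Codable {
    let alert: String
    let sound: String
}

struct APNsPayload: Codable {
    let aps: APSAlert
}

let payload = APNsPayload(aps: APSAlert(alert: "Were you involved in an accident?",
                                        sound: "default"))
let data = try! JSONEncoder().encode(payload)
print(String(data: data, encoding: .utf8)!)  // {"aps":{"alert":...,"sound":"default"}}
```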

📡 Emergency SMS Alerts

  • Integrated MessageUI to send emergency SMS with live GPS location to pre-selected contacts.
  • If no contacts are configured, the app prompts the user to call 911.
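The message body handed to `MFMessageComposeViewController` could be assembled like this; the wording and the Apple Maps URL format are illustrative assumptions, and the coordinates would come from CoreLocation in the real app:

```swift
import Foundation

/// Builds an emergency SMS body with a last-known-location link (assumed format).
func emergencyMessage(latitude: Double, longitude: Double) -> String {
    let lat = String(format: "%.5f", latitude)
    let lon = String(format: "%.5f", longitude)
    return "Possible crash detected. Last known location: "
         + "https://maps.apple.com/?ll=\(lat),\(lon)"
}
```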

🖥️ Backend Infrastructure

  • Lightweight Vapor backend deployed on DigitalOcean.
  • API endpoints for device token registration, crash logging, and push notification triggers.
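In Vapor 4, those endpoints might be sketched as below; the route paths and DTO shapes are illustrative assumptions, not the app's actual API:

```swift
import Vapor

// Assumed request bodies for the endpoints described above.
struct DeviceTokenDTO: Content {
    let token: String
}

struct CrashReportDTO: Content {
    let deviceToken: String
    let magnitude: Double
}

func routes(_ app: Application) throws {
    // Register a device token for later APNs delivery.
    app.post("devices") { req -> HTTPStatus in
        let dto = try req.content.decode(DeviceTokenDTO.self)
        req.logger.info("Registered token \(dto.token)")
        return .ok
    }

    // Log a crash event and trigger a push to emergency contacts.
    app.post("crashes") { req -> HTTPStatus in
        let report = try req.content.decode(CrashReportDTO.self)
        req.logger.info("Crash reported at \(report.magnitude) m/s²")
        return .accepted
    }
}
```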

⚙️ Background Processing

  • Used BackgroundTasks framework to allow motion detection while the app runs in the background.
  • Scheduled periodic background monitoring restart logic.
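The scheduling logic could be sketched with `BGTaskScheduler` as follows; the task identifier is an illustrative assumption and would need to match an entry under `BGTaskSchedulerPermittedIdentifiers` in Info.plist:

```swift
import BackgroundTasks

let taskID = "com.camsw0rld.crashdetection.refresh"  // assumed identifier

func registerBackgroundTask() {
    _ = BGTaskScheduler.shared.register(forTaskWithIdentifier: taskID, using: nil) { task in
        // Restart motion monitoring here, then queue the next refresh.
        scheduleAppRefresh()
        task.setTaskCompleted(success: true)
    }
}

func scheduleAppRefresh() {
    let request = BGAppRefreshTaskRequest(identifier: taskID)
    request.earliestBeginDate = Date(timeIntervalSinceNow: 15 * 60)  // ~15 min out
    try? BGTaskScheduler.shared.submit(request)
}
```

Note that iOS treats `earliestBeginDate` as a hint, not a guarantee, so the restart logic has to tolerate irregular wake-ups.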

🔍 Debugging & Optimization

  • Integrated Firebase Crashlytics for crash reports.
  • Built debug-friendly modes with accelerometer readouts and notification triggers.

📸 App Store Submission Prep

  • Created polished screenshots with MockUPhone and AppScreenshot.net.
  • Drafted privacy policy, description, and metadata for App Store Connect.

📖 How Core ML Is Integrated

📌 Overview

The Camsw0rld Crash Detection app uses Apple’s Core ML framework to perform real-time collision prediction based on accelerometer data. By leveraging a trained machine learning model, the app classifies detected motion events as potential vehicle collisions and triggers emergency workflows such as local alerts, push notifications, and SMS alerts to emergency contacts.

🧠 How It Works

Process for Determining Acceleration Magnitudes and Detecting Collisions

To enhance the safety features in the Camsw0rld Crash Detection app, a robust methodology was implemented to analyze motion data and accurately detect different types of driving behavior, including potential collisions.

Data Collection

Using Core Motion on iOS, real-time sensor data was collected, specifically:

  • xPoint, yPoint, zPoint acceleration components from the device’s accelerometer.
  • Each data entry was timestamped and optionally geotagged to provide contextual awareness.

The data was exported into CSV format (acceleration_data.csv) for further processing and model training.

Defining Acceleration Thresholds

Acceleration magnitude thresholds were defined based on research and industry standards:

  • No Significant Acceleration: magnitude < 0.5 m/s²
  • Normal Driving: magnitude 0.5 – 1.0 m/s²
  • Aggressive Acceleration: magnitude > 1.0 m/s²
  • Hard Braking: signed longitudinal acceleration < -2.0 m/s² (i.e., deceleration)
  • Potential Collision: magnitude ≥ 5.0 m/s²
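These thresholds can be expressed as a small classification function. The values are the ones listed above; the evaluation order (collision checked first, then braking) is an assumption about how ties are resolved:

```swift
/// Maps a signed forward acceleration (m/s²) to the categories above.
/// Ordering is an assumption: collision dominates, then braking.
func classify(_ acceleration: Double) -> String {
    let magnitude = abs(acceleration)
    if magnitude >= 5.0 { return "Potential Collision" }
    if acceleration < -2.0 { return "Hard Braking" }
    if magnitude > 1.0 { return "Aggressive Acceleration" }
    if magnitude >= 0.5 { return "Normal Driving" }
    return "No Significant Acceleration"
}
```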

Training an ML Model

A custom Core ML model was trained using labeled CSV data. Entries were tagged as low, moderate, or high intensity based on acceleration values. The model predicts the collision-speed category from the x, y, and z acceleration components.

Integration with App

The model was integrated into the app in Swift. Based on its predictions:

  • "low" → "Low-speed collision detected."
  • "moderate" → "Moderate-speed collision detected."
  • "high" → "High-speed collision detected."

When a collision is detected, the app:

  • Triggers an alert
  • Logs GPS location
  • Sends a remote push notification: “Were you involved in an accident?”

📦 Passing Data to Core ML

A Core ML model, trained offline and bundled inside the app, expects the three acceleration components (xPoint, yPoint, zPoint) as input. In code:


let output = try? model?.prediction(
    xPoint: motionData.xPoint,
    yPoint: motionData.yPoint,
    zPoint: motionData.zPoint
)

If the model predicts a collision based on the input values, the app:

  • Saves motion data to local storage for incident logs.
  • Triggers emergency services workflows.
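Saving the motion data for incident logs could look like the sketch below; the `IncidentLog` type and the file naming scheme are illustrative assumptions:

```swift
import Foundation

/// Sketch of persisting a motion sample to local storage (assumed type/file name).
struct IncidentLog: Codable {
    let xPoint: Double
    let yPoint: Double
    let zPoint: Double
    let timestamp: Date
}

func save(_ log: IncidentLog, to directory: URL) throws -> URL {
    let encoder = JSONEncoder()
    encoder.dateEncodingStrategy = .iso8601
    let name = "incident-\(Int(log.timestamp.timeIntervalSince1970)).json"
    let url = directory.appendingPathComponent(name)
    try encoder.encode(log).write(to: url)
    return url
}
```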

🤖 The Core ML Model

The Core ML model itself is a supervised classification model, trained on labeled accelerometer data. Labels include:

  • normal driving
  • braking
  • sharp turns
  • crash

It outputs a label (e.g., "crash" or "normal driving"), which the app uses to decide whether to trigger alerts.

📬 Acting on Predictions

When the model predicts a collision:

  • A local alert is presented.
  • An APNs push notification can be sent to notify emergency contacts.
  • Optional SMS messages are triggered using MFMessageComposeViewController with an emergency message and device location.

📂 Model Deployment

The trained .mlmodel file is compiled by Xcode into an .mlmodelc bundle as part of the build process. It’s included in the app resources and loaded at runtime.


let model = try? CCDCollisionClassifier(configuration: MLModelConfiguration())

🚦 Why Core ML?

  • On-device inference: No network dependency for crash detection.
  • High performance: Processes real-time accelerometer data without noticeable battery or CPU overhead.
  • Privacy-friendly: No sensitive motion data leaves the device.

📈 Future Improvements

  • Add model retraining support via Create ML using real-world driving data.
  • Incorporate additional sensors (like gyroscope) to improve prediction accuracy.
  • Migrate to Core ML’s newer MLUpdateTask for on-device model updates (if desired).

✅ Summary

This integration allows Camsw0rld Crash Detection to run real-time crash predictions on-device without cloud dependencies, delivering fast, secure, and reliable safety alerts when they're needed most.

✨ Final Thought

This project challenged me to navigate native iOS frameworks, APNs integration, and server-side Swift with Vapor. It reinforced the importance of precise entitlements, clean device token management, and delivering a clear, reliable experience for life-critical apps.

👨🏾‍💻 About Me

I'm a Software Engineer passionate about building reliable, scalable, and intuitive apps. I specialize in creating mobile applications from the ground up — complete with backend integration — that are both high-performing and user-friendly. Over the years, I’ve had the opportunity to work with companies both large and small, contributing to products that people love to use. I've been developing for Apple platforms since iOS 5 and have mastered both UIKit and SwiftUI. Mentoring other engineers in best practices and architectural design has been a meaningful part of my journey — something I take great pride in. To me, being a Senior Engineer isn’t just a title — it’s a mindset and a way of life.

📧 Contact Me