
Publish date: May 27, 2025
Duration: 22 min
Case details
Tower.dev allows users to develop and debug their data + AI applications locally, running small DeepSeek models on their dev machines. Engineers can use local GPUs, save on inference costs, and avoid rate throttling. When they are ready to deploy these applications to production, the same code can run against serverless inference providers like Hugging Face or Together.AI. Tower.dev provides a managed cloud platform to run data + AI apps in production, and a CLI to seamlessly synchronize development and production environments.

Key Takeaways:
- Run local inference of SOTA models like DeepSeek R1
- Get Tower development and production environments to run data + AI apps such as ETL jobs, feature transformations, and inference
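The develop-locally, deploy-to-a-hosted-provider pattern described above can be illustrated with a plain OpenAI-compatible client. This is a minimal sketch, not Tower.dev's SDK: the endpoints, environment variable, and model names are assumptions for illustration only.

```python
# Sketch: the same inference call targets a local DeepSeek model in development
# and a serverless provider such as Together.AI in production.
import os
from openai import OpenAI

if os.getenv("ENV") == "production":
    # Assumption: a Together.AI-style OpenAI-compatible endpoint in production.
    client = OpenAI(
        base_url="https://api.together.xyz/v1",
        api_key=os.environ["TOGETHER_API_KEY"],
    )
    model = "deepseek-ai/DeepSeek-R1"  # hosted model name (assumed)
else:
    # Assumption: a local OpenAI-compatible server (e.g., Ollama) on the dev machine.
    client = OpenAI(
        base_url="http://localhost:11434/v1",
        api_key="ollama",  # placeholder; the local server ignores it
    )
    model = "deepseek-r1:1.5b"  # small local R1 distill (assumed)

# Identical application code in both environments; only the endpoint changes.
response = client.chat.completions.create(
    model=model,
    messages=[{"role": "user", "content": "Summarize yesterday's ETL run."}],
)
print(response.choices[0].message.content)
```

Because only the base URL and model name differ, the switch between local debugging and production inference can be driven entirely by configuration, which is the workflow the Tower.dev CLI is meant to keep in sync.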