About Shoreline AI:
Shoreline AI is an Industrial AI/IoT startup providing a cloud-native, subscription-based SaaS solution that optimizes asset performance and operational efficiency for the industrial powertrains driving critical operations in Oil & Gas, Manufacturing, and other industries. Shoreline’s product is already deployed at very large enterprise customers, who are seeing significant benefits in their operations by avoiding expensive, unplanned downtime and improving the safety of their sites.
The Shoreline platform includes an industrial wireless sensor that installs in minutes and captures vibration data, along with additional environmental data, through its built-in sensors and external ports. Data is captured throughout the day and uploaded to the Shoreline cloud platform, which provides customers with AI/ML-driven insights and powerful data visualization tools for human experts.
Shoreline AI has its headquarters in Campbell, California. For more information, please visit https://shorelineai.us/.
About the Role:
We are looking for a Senior Cloud Data Platform Engineer/Architect to play a critical role in building and maintaining the data infrastructure and platform that power Shoreline AI’s product. The role is based at our US headquarters in Campbell, in the San Jose/San Francisco Bay Area.
Responsibilities:
• Design, implement, and maintain a scalable and secure data lake to handle both structured and semi-structured data, implement flexible data governance, and provide secure access for Data Scientists and Software Developers.
• Design and build the “Data API” on top of the data lake platform, giving developers easy programmatic access to the available data for processing, analytics, and visualization (see the sketch after this list).
• Create data pipelines to ingest, clean, and transform data from multiple sources.
• Define a strategy for the streamlined creation and deployment of containerized applications.
• Develop and maintain internal tools and frameworks for data ingestion using Python and SQL.
• Monitor data pipelines and cloud infrastructure for availability, low latency, and data correctness.
• Collaborate cross-functionally to define data models, contracts, schemas, access, and retention policies.
• Embrace software development and deployment best practices, including continuous integration/continuous deployment (CI/CD), Infrastructure as Code (IaC), and automated testing.
• Learn and adapt to new cloud technologies and development best practices.
• Maintain a strong customer-first attitude and ensure that all technical solutions focus on delivering customer delight.
• Participate in architecture, design, and code reviews, and maintain a high standard of quality, testing, documentation, and compliance with security standards.
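To make the “Data API” responsibility concrete, here is a minimal sketch of what a thin query layer over an Athena-backed data lake might look like. It is illustrative only: the database name, results bucket, and the sensor_readings table mentioned below are hypothetical placeholders, not Shoreline’s actual schema.

    import time

    import boto3

    # Hypothetical thin "Data API" layer over an Athena-backed data lake.
    # The database and results-bucket names are placeholders for illustration.
    athena = boto3.client("athena")

    def query_data_lake(sql, database="sensor_lake",
                        output="s3://example-athena-results/queries/"):
        """Run a SQL query against the data lake and return rows as dicts."""
        execution = athena.start_query_execution(
            QueryString=sql,
            QueryExecutionContext={"Database": database},
            ResultConfiguration={"OutputLocation": output},
        )
        query_id = execution["QueryExecutionId"]

        # Poll until the query finishes; production code would add
        # timeouts and exponential backoff.
        while True:
            status = athena.get_query_execution(QueryExecutionId=query_id)
            state = status["QueryExecution"]["Status"]["State"]
            if state in ("SUCCEEDED", "FAILED", "CANCELLED"):
                break
            time.sleep(1)
        if state != "SUCCEEDED":
            raise RuntimeError(f"Query {query_id} finished in state {state}")

        # The first row of an Athena SELECT result holds the column headers.
        rows = athena.get_query_results(QueryExecutionId=query_id)["ResultSet"]["Rows"]
        headers = [col["VarCharValue"] for col in rows[0]["Data"]]
        return [
            {h: col.get("VarCharValue") for h, col in zip(headers, row["Data"])}
            for row in rows[1:]
        ]

A caller could then fetch recent readings with, for example, query_data_lake("SELECT device_id, vibration_rms FROM sensor_readings LIMIT 10").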
Required Skills and Qualifications:
• 3+ years of experience architecting, designing, developing, and implementing cloud solutions on AWS
• A platform-builder mindset, demonstrated by experience defining and building APIs and tools that help other developers be productive
• Demonstrated ability to work with AI-based coding tools (e.g., Cursor, Claude Code, Gemini CLI) to accelerate learning, architecture and project planning, and the implementation of code and tests
• Deep understanding of SQL and modern data lake architectures (e.g., using Parquet, Iceberg, or Delta Lake)
• Experience working with real-time or batch data ingestion at scale and designing fault-tolerant ETL/ELT pipelines (see the sketch after this list)
• Familiarity with event-driven architectures and messaging systems like Kafka or Kinesis
• Hands-on experience with AWS services including but not limited to: S3, Lambda, API Gateway, Glue, Kinesis, Athena, and RDS
• Excellent collaboration and communication skills, and the ability to work with remote teams
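To give a flavor of the ingestion work described above, the following is a minimal, hypothetical sketch of a Kinesis-triggered AWS Lambda function that lands a batch of sensor records in S3 as Parquet. The bucket name, record fields, and partitioning scheme are assumptions for illustration, not details of Shoreline’s actual pipeline.

    import base64
    import json
    import uuid

    import boto3
    import pyarrow as pa
    import pyarrow.parquet as pq

    # Hypothetical Kinesis-to-Parquet ingestion step; the bucket, field names,
    # and partitioning scheme are invented for illustration.
    s3 = boto3.client("s3")
    BUCKET = "example-data-lake-raw"

    def handler(event, context):
        """Lambda entry point for a Kinesis-triggered batch of sensor records."""
        # Kinesis delivers record payloads base64-encoded inside the event.
        records = [
            json.loads(base64.b64decode(r["kinesis"]["data"]))
            for r in event["Records"]
        ]
        if not records:
            return {"written": 0}

        # Normalize into a columnar table; assumes each record carries
        # these three fields, with an ISO-8601 timestamp string.
        table = pa.table({
            "device_id": [r["device_id"] for r in records],
            "timestamp": [r["timestamp"] for r in records],
            "vibration_rms": [float(r["vibration_rms"]) for r in records],
        })

        # Write one Parquet object per batch, partitioned by ingestion date
        # so that Athena/Glue can prune partitions at query time.
        key = f"vibration/dt={records[0]['timestamp'][:10]}/{uuid.uuid4()}.parquet"
        buf = pa.BufferOutputStream()
        pq.write_table(table, buf)
        s3.put_object(Bucket=BUCKET, Key=key, Body=buf.getvalue().to_pybytes())
        return {"written": len(records), "key": key}

In a real pipeline this step would sit behind schema validation and monitoring, per the responsibilities above; writing one Parquet object per batch and partitioning by date keeps the data friendly to Athena and Glue.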