Cloud Data & AI Engineer Job at Osi Digital, Irvine, CA

  • Osi Digital
  • Irvine, CA

Job Description


Job Title: Cloud Data & AI Engineer
Employment: Full-time

At OSI Digital Inc, we accelerate our clients' digital transformation journeys by delivering modern data solutions, enabling them to unlock the full potential of their data with scalable cloud platforms, intelligent analytics, and AI-driven solutions. With deep expertise across data engineering, cloud platforms, advanced analytics, and AI/ML, our teams bring both technical mastery and business acumen to every engagement. We don't just implement tools; we build scalable, future-ready solutions that drive measurable outcomes for our clients.

Role Summary:

We are seeking a highly skilled, results-driven Cloud Data and AI Engineer with a strong background in modern cloud data architecture, specifically on Snowflake, and hands-on experience developing data solutions in Power BI and implementing AI solutions.
The ideal candidate combines strong data engineering, integration, and BI expertise with hands-on AI project execution, supporting OSI's reputation for high-impact consulting in the cloud and digital transformation space. They will also be a strong communicator, capable of implementing projects from the ground up.

Key Responsibilities:
  • Lead the design, development, and implementation of highly scalable and secure data warehouse solutions on Snowflake, including schema design, data loading, performance tuning, and optimizing cloud costs.
  • Design and build robust, efficient data pipelines (ETL/ELT) using advanced data engineering techniques, including hands-on data integration via direct APIs (REST/SOAP) and work with various integration tools (e.g., Talend, Stitch, Fivetran, or native cloud services).
  • Develop and implement high-impact visual analytics and semantic models in Power BI. Apply advanced features such as DAX, Row-Level Security (RLS), and dashboard deployment pipelines.
  • Apply proficiency in Python/R, familiarity with ML frameworks (scikit-learn, TensorFlow, PyTorch), and MLOps concepts to deploy models into production environments on cloud platforms.
  • Develop and deploy AI/ML solutions using Python, Snowpark, or cloud-native ML services (AWS SageMaker, Azure ML).
  • Exposure to LLM/GenAI projects (chatbot implementations, NLP, recommendation systems, anomaly detection) is highly desirable.
  • Implement and manage data solutions utilizing core services on at least one major cloud platform (AWS or Azure).
  • Demonstrate exceptional communication and articulation skills to engage with clients, gather requirements, and lead project delivery from the ground up (inception to final deployment).
Required Qualifications:
  • Minimum of 4 years of professional experience in data engineering, consulting, and solution delivery.
  • Bachelor’s degree in Computer Science, Engineering, or a related technical field; a master’s degree in a relevant field is highly preferred.
  • Strong, hands-on experience in end-to-end Snowflake project implementation; professional certifications in Snowflake are preferred.
  • Expertise in designing, building, and maintaining ELT/ETL pipelines and data workflows, with a solid understanding of data warehousing best practices.
  • Hands-on experience implementing dashboards in Power BI, including DAX and RLS. Professional certifications in Power BI are preferred.
  • Proficiency in Python, with demonstrable experience deploying at least one AI/ML project (e.g., Snowpark, Databricks, SageMaker, Azure ML) including feature engineering, model deployment, and MLOps practices.
  • Experience with machine learning frameworks such as scikit-learn, TensorFlow, or PyTorch, and hands-on exposure to production deployments.
  • Familiarity with projects involving LLM/Generative AI (e.g., chatbots, NLP, recommendation systems, and anomaly detection).
  • Hands-on experience working with cloud platforms, specifically AWS or Azure.
  • Excellent verbal and written communication, presentation, and client-facing consulting skills, with a proven track record of successfully leading projects from inception.

Preferred (Added Advantage) Qualifications:
  • Experience with Tableau or other leading BI tools.
  • Working knowledge of Databricks (e.g., Spark, Delta Lake).
  • Experience or strong understanding of Data Science methodologies and statistical modeling.
  • Relevant industry certifications, including Power BI, Snowflake, Databricks, and AWS/Azure Data/AI credentials.

Job Tags

Full-time
