Data Warehouse Architect

Work Type: Full Time

Position: Data Warehouse Architect

OJ Commerce (OJC), a rapidly expanding and profitable online retailer, is headquartered in Florida, USA, with a fully-functional office in Chennai, India.

We deliver exceptional value to our customers by harnessing cutting-edge technology, fostering innovation, and establishing strategic brand partnerships to enable a seamless, enjoyable shopping experience featuring high-quality products at unbeatable prices. Our advanced, data-driven system streamlines operations with minimal human intervention.

Our extensive product portfolio encompasses over a million SKUs and more than 2,500 brands across eight primary categories.

With a robust presence on major platforms such as Amazon, Walmart, Wayfair, Home Depot, and eBay, we directly serve consumers in the United States.

As we continue to forge new partner relationships, our flagship website has rapidly emerged as a top-performing e-commerce channel, catering to millions of customers annually.

Experience: 12+ years of relevant experience, with a minimum of 5 years in data warehousing.

Role: Individual Contributor.

Roles & Responsibilities:

1. Data Architecture:
a. Design and implement data platforms and infrastructure with considerations around data storage strategy, security, latency, reliability, scalability, and cost.
b. Provide thought leadership on data management and data governance for the organization from a data architecture perspective (data granularity, cross-application data access design).
c. Design and build an Enterprise Data Architecture (EDA), and ensure that new applications follow the EDA and conform to existing data models (logical/physical).

2. Data Modeling and Warehousing:
a. Create relational and NoSQL data models to fit the needs of a diverse set of data consumers (data analysts, data scientists, and business analysts).
b. Design and build databases hosting a variety of enterprise data, both on-premises and in the cloud, with robust design principles around availability, consistency, and performance.
c. Design and implement data lakes for analytics and AI/ML use cases.

3. Data Access:
a. Design and build database query infrastructure for custom machine learning applications, including online and API-based queries.
b. Design APIs to support data requirements for business intelligence dashboards.

4. Data Pipelines:
a. Design and implement ETL pipelines to ingest data into databases from transactional systems, streaming sensor data, data files, etc.
b. Design and implement automated schedules for monitoring data pipelines, data quality, and data lineage.

Skills Required:

· Proficient in SQL and Python

· Proficient in using data modeling tools (ER diagrams, logical data modeling, etc.)

· Demonstrated expertise in database design across SQL/NoSQL databases (MS SQL Server, MySQL, MongoDB, time-series databases such as InfluxDB, etc.)

· Expert-level Azure practitioner, with strong expertise in workflow engines such as Data Factory, EMR, and Databricks

· OLAP cubes and Azure Analysis Services

· Exposure to ML libraries such as scikit-learn

· Data visualization tools such as Grafana, Redash, or other open-source visualization tools

· Experience writing ETL pipelines and related services in the cloud using Azure Data Factory (ADF) or other open-source tools

· Experience with stream processing engines such as Spark, Kafka, Kinesis, GCP Pub/Sub, etc.

Traits Expected:

· Ability to work in a fast-paced, dynamic environment with limited resources

· Problem-solver with a desire to solve complex business problems with innovative and cost-effective solutions

· Excellent analytical and problem-solving skills, with out-of-the-box thinking

· Experience in e-commerce / online retail is a plus

· Comfortable dealing with senior leadership (internal and external)

· Communication skills: excellent verbal and written communication, with the ability to explain ideas clearly

· Self-motivated and flexible, with an ability to work independently

Submit Your Application