
Data Architect

Job Description: Data Architect

Roles and Responsibilities

Location: Unanticipated worksite locations throughout the U.S. Foreign equivalent accepted.

Basic Function:

  • The Big Data Architect works closely with the customer and the solutions architect to translate the customer’s business requirements into a Big Data solution.
  • This includes understanding the customer data requirements, platform selection, design of the technical architecture, design of the application and interfaces, and development, testing, and deployment of the proposed solution.
  • Has the ability to design enterprise-grade large-scale data processing systems and help identify the best options for architecture.
  • The Big Data Architect also understands the complexity of data and can design systems and models to handle a wide variety of data with varying levels of volume, velocity, and veracity.
  • Should have independently worked on proposing architecture, design, and data ingestion concepts in a consultative model.
  • Leads client assessments, preparing current state and future state architectures along with go-forward recommendations.
  • Will work with the practice leads and account management team to develop statements of work, implementation plans, resource plans, and project estimates.

Essential Functions:

  • Has a deep understanding and experience with several of the following: Business Analysis, Requirements Gathering, Data Analysis, Data Modeling, Project Management, Project Estimation
  • Advanced knowledge of design and architecture patterns and methodologies.
  • Demonstrated work ethic, focus, and self-discipline
  • Has and maintains a deep understanding of the role of big data in business and the enterprise.
  • Propose recommended and/or best practices regarding the movement, manipulation, and storage of data in a big data solution, including data ingestion, data storage options, query techniques, and data variety, volume, and velocity.
  • Collaborate with project teams on the platform development process from inception to production deployment, including project scheduling, design, implementation, and coordination with other team members.
  • Collaborate with other technology teams and architects to define and develop cross-function technology stack interactions.
  • Research and experiment with emerging technologies and tools related to big data.
  • Experience in scaling applications on big data platforms to massive size.
  • Performing solution architecture in adherence to enterprise architecture governance.
  • Bridging the business and development teams.
  • Long-term development and technical expertise in DW/BI practice: communicate well with all stakeholders, optimize objectives, leverage state-of-the-art tools and best practices, integrate into corporate systems, and deliver on time.
  • Deep understanding of data warehousing, enterprise architectures, dimensional modeling, star and snowflake schema design, reference DW architectures, ETL (Extract/Transform/Load) architecture, data analysis, data conversion/transformation, database design, data warehouse optimization, data mart development, and enterprise data warehouse maintenance and support.
  • Should have independently worked on proposing architecture, design and data ingestion concepts.

Skills:

Technical Skills (Knowledge, Skills & Abilities):

  • Experience with enterprise data management, Business Intelligence, data integration, and SQL database implementations
  • Experience with the major big data solutions such as Hadoop, MapReduce, Hive, Spark, Scala, HBase, MongoDB, and Cassandra.
  • Programming/scripting languages such as Java, Linux shell, PHP, Ruby, Python, and/or R, as well as experience working with ETL tools such as Informatica, Talend, Pentaho, etc.
  • Experience in designing solutions for multiple large data warehouses, with a good understanding of cluster and parallel architecture, high-scale or distributed RDBMS, and/or NoSQL platforms.
  • Experience in data migration from relational databases to Hadoop HDFS
  • Propose best practices/standards
  • Translate, load, and present disparate datasets in multiple formats/sources including JSON, XML, etc.

Technical Skills:

  • Hadoop stack: HDFS cluster, MapReduce, Hive, Spark, and Impala
  • Web technologies: CSS, DHTML, XML, Highcharts, Linux
  • ETL tools: Informatica, Talend, and/or Pentaho
  • Query: SQL, NoSQL concepts
  • Ingest: Kafka, Sqoop, Flume
  • Orchestration: ZooKeeper
  • Databases: Postgres, MongoDB, Cassandra, HBase
  • Languages: Java, Scala
  • Scripting: JavaScript, DHTML, XML, Shell

Good-to-Have Skills:

  • Core: AWS, Hadoop, YARN
  • Process: Agile-Scrum, iterative development, DevOps, CI
  • Analytics: descriptive, predictive (added advantage)
  • Tools: Jenkins and TFS
  • Languages: Python, Java Enterprise

Education Requirements: Master’s degree or equivalent

Work Experience Requirements:

  • Minimum of 1 year of professional experience in Data Architecture.

Select Minds LLC seeks Master’s + 1 yr. Exp/Equiv.: Data Architect (SMDA21) Python, Cobol, CICS, Spark, Mongo, DB2, and Oracle. Mail resume with job ID # to HR: 1750 E Golf Rd, Suite 395 C, Schaumburg, IL 60173. Unanticipated worksite locations throughout the U.S. Foreign equivalent accepted.

Job Category: Programming & Design Software Developer
Job Type: Full Time
Job Location: Schaumburg, IL, USA
