Golden State Foods

  • Business Intelligence Data Engineer

    Location: US-CA-Irvine
    Job ID:
    Information Technology
  • Overview

    The Data Engineer will be responsible for the scripts and processes required to extract, transform, clean, and move data and metadata so they can be loaded into a data warehouse, data mart, or operational data store. This role analyzes what the company wants to accomplish with its data and designs the best possible ETL process around those goals. The Data Engineer will gather, collect, and store data, run batch or real-time processing on it, and serve it via an API for open and easy access. The role is also responsible for evaluating the many Big Data tools available, incorporating them into the data platform, and educating others on how best to use them.


    • Work from business requirements to identify and understand source data systems; resolve data issues, coordinate with data analysts to validate requirements, and conduct interviews with users and developers
    • Map source system data to data warehouse tables
    • Develop and run tests to validate all data flows; prepare ETL processes according to business requirements and incorporate those requirements into design specifications
    • Define and capture metadata and rules associated with ETL processes
    • Adapt ETL processes to accommodate changes in source systems and new business user requirements
    • Collaborate with developers and business users to gather required data, execute ETL programs and scripts, implement data warehouse activities, and prepare reports on them
    • Provide support and maintenance on ETL processes and documentation
    • Build partnerships across the application, business, and infrastructure teams
    • Strive to continuously improve software delivery processes and practices
    • Champion company standards and best practices
    • Accountable for properly following all IT standards, processes and methodologies as applicable including but not limited to Quality Assurance (QA)
    • Other responsibilities and accountabilities may be assigned based on business and organization needs



     Regular travel requirements (20% - 35%)



    Four-year college degree in a quantitative field such as Biostatistics or Computer Science, or commensurate work experience


    • MUST HAVE experience with AWS (Redshift, S3, or tools such as AWS Glue)
    • MUST HAVE at least three (3) years of industry experience as a Data Engineer or in a related specialty (e.g., Software Engineer, Business Intelligence Engineer, Data Scientist), with a proven track record in a role focused on understanding, manipulating, processing, and extracting value from large, disconnected datasets
    • Strong proficiency in SQL
    • Expert in ETL
    • Creative in finding new solutions and designing innovative methods, systems and processes
    • Desire to create and build new predictive/optimization tools for structuring datasets

    Preferred Qualifications

    • Strong analytic skills for working with unstructured datasets
    • Strong project management and organizational skills, experience working on complex initiatives with cross-functional teams in a dynamic environment
    • Experience with connecting data sets to data visualization tools and creating reports using Tableau and Power BI




    • Business Acumen
    • Business Alignment
    • Requirements Definition


    • Oral and Written Communication Skills
    • Operational Process Flows
    • Analytical skills



