We are looking for an experienced Data Engineer to join our team. In this role, you will design and implement high-performance data ingestion pipelines from multiple sources, leveraging Apache Spark and/or Azure Databricks. You will also deliver and present proofs of concept to stakeholders and develop scalable, reusable frameworks for ingesting geospatial data sets. This is an exciting opportunity to contribute to cutting-edge data engineering initiatives in a fast-paced environment.

Key Responsibilities:
- Design and Implement Data Pipelines: Build and optimize high-performance data ingestion pipelines from diverse sources using Apache Spark and/or Azure Databricks, ensuring scalability and reliability.
- Proofs of Concept: Deliver and present proofs of concept for key technology components, showcasing new approaches and solutions to project stakeholders.
- Geospatial Data Integration: Develop scalable, reusable frameworks for efficiently ingesting and processing geospatial data sets, supporting both structured and unstructured data formats.
- Collaborate with Teams: Work closely with cross-functional teams (data scientists, analysts, product owners) to gather requirements and ensure successful integration of data pipelines.
- Optimize Performance: Continuously monitor and improve the performance, scalability, and cost-efficiency of the data pipelines and ingestion frameworks.
- Ensure Data Quality: Apply best practices to ensure data quality, security, and integrity across the entire ingestion process.

Required Qualifications:
- Experience: Proven experience in data engineering, specifically in building data pipelines with Apache Spark and/or Azure Databricks.
- Geospatial Data Expertise: Familiarity with handling geospatial data sets, including their structure and integration techniques.
- Data Engineering Frameworks: Experience developing scalable, reusable data frameworks for complex data processing.
- Programming: Proficiency in Python, Scala, or Java for building data pipelines and processing large data sets.
- Cloud Technologies: Experience with cloud platforms, especially Azure, and familiarity with data processing tools such as Azure Data Lake, Databricks, or HDInsight.
- Collaborative Skills: Strong communication skills and experience working in cross-functional teams.

Preferred Qualifications:
- Experience with Apache Hadoop, Kafka, or other big data frameworks.
- Familiarity with machine learning pipelines or data processing in AI/ML environments.
- Experience with ETL/ELT processes and data warehousing solutions.

Certifications (if applicable):
- Microsoft Certified: Azure Data Engineer Associate, or other relevant certifications in data engineering or cloud technologies.

Compunnel, Inc.