Full Job Description
Data Engineer
Northmarq – Minneapolis, MN (Bloomington, MN)
At Northmarq, you can be part of something special. Northmarq is a unique capital markets resource for commercial real estate investors, providing our clients access to experts in debt, equity, investment sales, loan servicing, and fund management. We offer you a career path with best-in-class training, and we foster inclusive teams committed to collaboration, mentoring, and growth. At Northmarq, we will help you unlock your potential – whether you are an industry veteran or you’re just getting started. Your new career is waiting. Start something special today. Northmarq was voted by Real Estate Forum as one of The Best Places to Work in Commercial Real Estate!
Northmarq is seeking a motivated and detail-oriented Data Engineer to join its intelligence platform team at the Minneapolis, MN (Bloomington, MN) headquarters office. The ideal candidate will have foundational knowledge and experience with Azure Data Factory (ADF), Databricks, Snowflake, Power BI, and Azure Git. This role is perfect for someone looking to grow their career in data engineering and gain hands-on experience with modern data technologies.
*This is an in-office position with an immediate start date.
Position Responsibilities:
- Data Pipeline Development:
  - Assist in designing, developing, and maintaining data pipelines using Azure Data Factory (ADF) and Databricks.
  - Support the processing, transformation, and loading of data to ensure it meets business requirements.
- Data Warehousing:
  - Help manage data warehousing solutions using Snowflake.
  - Participate in optimizing data storage and retrieval processes.
- Business Intelligence:
  - Assist in developing and maintaining Power BI dashboards and reports.
  - Collaborate with business analysts and other stakeholders to gather reporting requirements.
- Version Control and Collaboration:
  - Use Azure Git for version control and code management.
  - Ensure proper documentation and versioning of data-related processes and code.
- Data Quality and Performance:
  - Monitor data pipelines and processes to ensure data quality and performance.
  - Implement basic data validation and error handling mechanisms.
- Learning and Development:
  - Continuously improve technical skills and stay updated with the latest industry trends and technologies.
  - Participate in training sessions and mentorship opportunities provided by senior team members.
Qualifications:
- Bachelor's degree in Computer Science, Engineering, Information Technology, or a related field.
- 2-4 years of experience with ETL processes, data modeling, and data warehousing concepts.
- Basic understanding and some experience with Azure Data Factory (ADF), Databricks, Snowflake, Power BI, and Azure Git.
- Knowledge of cloud platforms, ideally Azure.
- Strong analytical and problem-solving skills.
- Good communication and collaboration skills.
- Ability to work independently and as part of a team.
- Internship or project experience in data engineering or related fields.
- Exposure to additional data tools and technologies such as Apache Spark or similar.
- Knowledge of SQL and Python.
Job Information
Job Category: Information Technology