Data Warehouse specialist

Search among 34 open jobs as a Data Warehouse Specialist and start your new working life today!

Machine Learning Engineer AI, Analytics & Data

Company Description
H&M is on a journey to meet and exceed our customers' expectations today and tomorrow. Through collaboration, innovation, and technology we challenge ourselves and the industry. To cater to the individual needs and desires of our millions of customers, our tech organisation delivers solutions for the entire value chain for all our brands. We are accelerating digitalisation, and to stay relevant we need to ensure we have strong leaders in place to bring our best capabilities, innovative ideas, and talented technologists to support the transformation of H&M. We take pride in our history of making fashion accessible to everyone, and our ambition for tomorrow is to make fashion even more sustainable, inclusive, and welcoming. If you want to make an impact as a Machine Learning Engineer on a global scale, then this is the opportunity for you!

Job Description
We are now looking for Machine Learning Engineers to bring key competencies to our Tech Center AI, Analytics & Data, part of the Business Tech organisation. You will join our Machine Learning Area, where our vision is to make H&M the industry leader in applied AI, with scalable and integrated solutions covering the entire value chain. Are you passionate about creating world-class AI solutions? You will play an important role in setting the foundation and shaping our future organisation, while continuously learning and growing. Furthermore, you will:
Collaborate with Data Scientists and Data Engineers to develop machine learning software products. This includes exploring large datasets, experimenting with new algorithms, feature engineering, testing and evaluating model output, deploying solutions for production usage, and scaling them across H&M's comprehensive fashion network.
Design, develop, and maintain the large-scale data and cloud infrastructure required for machine learning projects.
Utilize your understanding of software architecture and design patterns to write scalable, maintainable, well-designed, and future-proof code.
Develop tools, frameworks, and components to address common needs in machine learning projects, such as model training, serving, monitoring, versioning, explainability, feature reuse, A/B testing, infrastructure, and security.
Leverage your expertise in working with GCP and Vertex AI as a foundation for scaling AI and ML enterprise development.
Work in cross-functional agile teams of highly skilled machine learning engineers, data engineers, data scientists, and business stakeholders to scale and build the AI ecosystem within H&M.

Qualifications
We are on a journey of scaling AI development and changing the fashion retail industry - we will continue to test, fail, and learn. You are an important player in this transformation; therefore, we believe you are curious, eager to learn and spread your knowledge, a true team player, and someone with a positive mindset who embraces change. We believe that you:
Have a BSc or MSc degree in computer science, engineering, or a related field, or equivalent practical experience.
Have 3+ years of professional experience in a role relevant to Machine Learning Engineering.
Are a hands-on person who loves coding and likes applying software engineering practices to machine learning projects.
Have experience in developing software products that have been successfully deployed to production.
Have several years of coding experience in modern programming languages such as Python, Java, or Go.
Are familiar with cloud technologies (preferably Google Cloud).
In addition, we appreciate that you:
Have solid experience in MLOps practices, developing ML pipelines, and deploying ML applications to production.
Possess a strong working knowledge of a variety of machine learning techniques, including regression, clustering, decision trees, probability models, and neural networks. You should also have extensive experience working with different ML frameworks.
Have a strong background in Python programming and great hands-on experience with GCP and Vertex AI.
Have experience handling high-volume heterogeneous data (both batch and stream) and a solid understanding of data structures, databases, and data storage technologies.
Have had exposure to scalable, highly available, fault-tolerant, and secure system design and implementation.
Embrace DevOps principles and possess hands-on experience with modern DevOps practices.
Have experience in software product lifecycle management, including designing, prototyping, implementing, testing, packaging, deployment, integration, and maintenance.
Are familiar with agile ways of working, team collaboration, data-driven development, and reliable, responsible experimentation.

Additional Information
Working with tech at H&M means shaping the future of fashion with people, data, and tech. The fashion and retail industries are going through a transformation, driven by customers' technology and sustainability expectations. At H&M Group, we want to shape the future of fashion and lifestyle by harnessing the power of smart tech and data. With our 74-year history of innovation, we understand the need to collaborate and co-create with engineers and tech specialists around the world to achieve our vision.

What we offer!
You are joining a unique value-driven culture and a large tech network and community where you can be yourself. Besides the obvious perks, such as a staff discount card, flexible work life, learning communities, wellness benefits, and parental benefits, there are endless opportunities to experiment and grow in any direction that you want - and when you grow, we grow. Being a major player gives us countless opportunities to make a real impact and shape the future. We are committed to creating an inclusive and diverse workplace with a culture that is dynamic and innovative. Do you think we are a match? We hope so! This is a full-time position with placement in Stockholm.
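As a loose illustration of the workflow this posting describes (feature engineering, training, and evaluating model output before anything is promoted to serving), here is a minimal Python sketch using scikit-learn. The dataset, features, and quality threshold are hypothetical and not taken from the listing.

```python
# Minimal, generic sketch of a train/evaluate step of the kind the posting
# describes. Dataset, features, and the metric threshold are hypothetical.
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

# Stand-in for features engineered from real customer/article data.
X, y = make_classification(n_samples=5_000, n_features=20, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42
)

model = GradientBoostingClassifier(random_state=42)
model.fit(X_train, y_train)

# Evaluate model output before any deployment decision.
auc = roc_auc_score(y_test, model.predict_proba(X_test)[:, 1])
print(f"Hold-out ROC AUC: {auc:.3f}")

if auc < 0.75:  # hypothetical quality bar
    raise ValueError("Model below quality bar; do not promote to serving.")
```

In a setup like the one described, a check of this kind would typically run as a pipeline step before the model is registered or deployed, for example on Vertex AI.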

16 May 2024
Application deadline:
30 May 2024
Product Data Analyst to H&M Value Stream Activate

Company Description
Sales is a global function within the H&M brand with the mission of growing and engaging our customer base, as well as growing and developing our sales channels. By holding the responsibility for our regions and bringing together expertise from both business and digital & tech perspectives, Sales plays a key role in delivering on our Brand Plan - to elevate H&M through product, experience, and brand, providing aspirational fashion at an unbeatable price. The mission of the Customer Activation unit in Sales is to create a strategic plan that brings our brand direction to life through different touch points. Through data and customer insights, we will design a plan that ensures a holistic customer experience, from communication with our customers to our membership program.

Job Description
In this role you will belong to the Product Data Analytics team and will be working within the product teams of the Value Stream Activate. Our Value Stream develops and delivers a comprehensive range of products and features that cater to various stages of the customer activation journey. We enable tech solutions for our CRM and membership program, as well as across the media channels (Google, Meta, Snapchat, TikTok, etc.). By leveraging innovative technology, data-driven insights, and a deep understanding of customer behavior, we strive to create a seamless customer journey throughout the funnel, and to provide our regional teams and relevant business stakeholders with the tools they need to effectively engage, retain, and maximize the value of our customers. As a Product Data Analyst you will be key in driving data-driven product development. As an essential member of the product team you will help shape product roadmaps, build understanding of the value delivered by the product team, set and monitor KPIs for measuring success, and provide insights supporting strategic decisions. We believe that you have the leadership skills to drive the analytical agenda within the cross-functional team as well as across a vast number of stakeholders (global commercial teams, the analytical community, business tech teams, and possibly external partners). In this role you will:
Be responsible for understanding and communicating the value the product team delivers towards our customers and our business. Support in creating business cases that inform our product development roadmaps and make sure we prioritize the right initiatives, as well as follow up on how the value materializes once the development has been delivered.
Translate the analysis performed into actionable insights and effectively advocate those insights across various stakeholders within and outside the Value Stream.
Use SQL, SAS, Google BigQuery, Google Looker, Power BI, and other analytical tools to dive deep into our data, and be an expert in the data relevant for your product(s). You will also have the opportunity to drive requirements towards the teams building our data products.
Advocate for data-driven product development by defining success metrics (OKRs and KPIs), driving the A/B testing agenda, and setting a tracking framework for newly developed features to ensure we collect high-quality data.
Participate in the wider analytical community within Customer Activation, Sales, and the broader H&M Group to grow our collective knowledge and skills.

Qualifications
2+ years of experience within analytics, preferably as a data analyst in an agile product team setting.
Academic degree within Business, Economics, Statistics, Engineering, or similar.
Proficiency in SQL and experience handling large datasets across multiple data sources.
Experience working with customer data, preferably with strong competence in SAS.
Ability to synthesize data into clear insights and communicate complex problems in a simple way to different types of audiences (both technical and non-technical) in a matrix organisation.
Analytical mindset and the ability to lead with data-driven insights within a cross-functional team.
Experience with GCP (BigQuery) or Databricks and with data visualisation tools (Looker Studio, Looker, Power BI, etc.) is preferred.
Experience working with loyalty or membership programs is beneficial.
Experience with web analytics data, A/B testing methodologies, and tracking development is preferred.
Fluent in English (written and spoken).

Additional information
This is a full-time position with placement in Stockholm. If your experience, skills, and ambitions are right for this role, please apply with a CV in English as soon as possible. Please do not send applications to individual email addresses; due to GDPR, we only accept applications through our career page. We look forward to receiving your application! H&M Group is committed to creating a diverse and inclusive environment, and we are actively looking for qualified candidates irrespective of race, gender, gender identity, sexual orientation, ethnicity, religion, national origin, disability or age.
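Because the role highlights A/B testing and defining success metrics, here is an illustrative two-proportion z-test in Python, the kind of significance check such a testing agenda usually relies on. The conversion counts are invented for the example.

```python
# Illustrative two-proportion z-test for an A/B test; all numbers are made up.
from math import sqrt
from scipy.stats import norm

conv_a, n_a = 1_230, 24_000   # control: conversions, visitors
conv_b, n_b = 1_340, 24_100   # treatment: conversions, visitors

p_a, p_b = conv_a / n_a, conv_b / n_b
p_pooled = (conv_a + conv_b) / (n_a + n_b)
se = sqrt(p_pooled * (1 - p_pooled) * (1 / n_a + 1 / n_b))

z = (p_b - p_a) / se
p_value = 2 * (1 - norm.cdf(abs(z)))  # two-sided p-value

print(f"control={p_a:.4f}, treatment={p_b:.4f}, z={z:.2f}, p={p_value:.4f}")
```

In practice the same comparison would usually be run over counts pulled with SQL from a warehouse such as BigQuery rather than hard-coded numbers.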

16 May 2024
Application deadline:
28 May 2024
Data Engineer for Ikano Bank

Are you passionate about data and technology? Are you a motivated and analytical person seeking an exciting and rewarding opening in a fast-paced environment? Have you worked with Data Engineering and AWS? If you're looking for change - we're looking for you! Ikano Bank's Digital organization is on an exciting transformation journey to advance our mindsets, ways of working, and solutions. Our vision is to establish user-friendly digital capabilities for the many people by unlocking the full potential of our co-workers and partners. Let's achieve great things together!

We offer you
As a Data Engineer you will strengthen our Data & Analytics function, and you will have a great platform to expand your career and data knowledge. You will work in a diverse and friendly team in a fast-paced environment, acting as a subject matter expert and collaborating closely with the Scrum Master, Product Owner, Business Intelligence Architect, and Data Analysts, as well as business stakeholders. This position is more than what meets the eye. You will get the chance to push boundaries and influence your daily work. All as part of a caring and open culture where we live our values, work hard, have fun, and operate with a long-term perspective. Because here opportunities arise, and growth follows. Are you the one for Ikano?

Key responsibilities:
Analyze and understand raw source data as well as business data needs.
Develop and support data pipelines on IKANO Bank's AWS-based Data Platform.
Transform designs and requirements into efficient, high-performing, easily maintainable, and reusable code.
Apply critical thinking skills; identify trends and root causes of relatively complex problems.
Automate as much as possible, from CI/CD pipelines to testing and data quality.
Participate in the Solution Design process and proactively recommend solutions.
As we strive to work according to DevOps practices, you will share the responsibility with the rest of the team to make sure that our solutions are developed, tested, deployed, and run well.
Collaborate with, help, and support team members and other stakeholders.

Key Qualifications
Understand the demands of particular requirements or problems and apply data engineering skills and techniques accordingly.
A knowledge base acquired from a few years of experience with data technologies used for data transformation in relational databases and big data environments (such as S3, Python, IAM, AWS Glue, AWS Athena, Spark, Redshift, Airflow, etc.).
Experience building data lakes, data warehouses, or data marts in AWS.
Experience working with complex data sets and building and maintaining data platforms with a focus on data integrity and high data quality.
Experience building pipelines for batch and streaming sources.
Excellent English communication skills (written and verbal).

Meritorious:
Experience in data modeling, particularly for data warehouses.
Knowledge of infrastructure as code and CI/CD pipelines.
Experience working in a DevOps team.
Experience collaborating with an offshore support team.
Experience with our current data platform technologies (Oracle RDBMS and Informatica).
Experience and knowledge of agile frameworks, preferably SAFe.
Financial/banking experience.

We want our customers, partners, and co-workers to choose us for what we stand for, what we deliver, and how we deliver it. Three basic values guide our work: common sense and simplicity, working together, and daring to be different.
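To make the pipeline work above concrete, here is a minimal, generic PySpark sketch of a batch transformation of the kind the responsibilities describe: read raw source data, apply a simple quality rule, and write a curated, partitioned dataset. Bucket names, paths, and columns are hypothetical and not Ikano Bank's.

```python
# Generic batch pipeline step: raw zone -> curated zone. Paths and columns
# are hypothetical; running against S3 requires the usual Hadoop/AWS setup.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("example-batch-transform").getOrCreate()

raw = spark.read.parquet("s3://example-raw-bucket/transactions/")

curated = (
    raw.filter(F.col("amount") > 0)                  # basic data-quality rule
       .withColumn("ingest_date", F.current_date())  # partitioning column
       .groupBy("customer_id", "ingest_date")
       .agg(F.sum("amount").alias("total_amount"))
)

# Partitioned write to the curated zone of a data lake.
curated.write.mode("overwrite").partitionBy("ingest_date").parquet(
    "s3://example-curated-bucket/customer_daily_totals/"
)
```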
Join us now – together we will find a better way
This is a full-time position located in Malmö or Nottingham. In this recruitment we are cooperating with Cabeza. More information about the position can be provided by Recruitment Consultant Malin Schultz: [email protected] We look forward to receiving your application!

About us
At Ikano, our vision is to create possibilities for better living. We are an international group of companies active within finance, insurance, production, real estate, and retail. Ikano Group was established in 1988 and is owned by the Kamprad family. Our mission is to simplify the many people's lives so they can focus on living them. We do this by working together to create simple and meaningful solutions based on fair terms that bring value to our customers. Find out more about us on www.ikanogroup.com

Keywords: Data Engineer Banking, AWS, Financial Services

13 May 2024
Application deadline:
17 July 2024
Data Engineer to Fagerhult

We are looking for a skilled and motivated Data Engineer to join our team. As a Data Engineer, you will play a crucial role in designing, building, and maintaining data systems using the Microsoft Fabric platform. Your expertise will contribute to transforming raw data into actionable insights, enabling data-driven decision-making across the organization. Digitalization and data analysis span many different areas, and while we have made progress in some, there are significant opportunities for development in others. Traditional data analysis and business intelligence have been our focus for over 15 years, but now we are facing a technological shift. Our goal is to create even better conditions for the organization to leverage information and realize its full potential. Today we have our data warehouse on-prem, with Power BI and Excel as user interfaces. We aim to replace the on-prem solution and move the data into Fabric during the next two years. We will also transfer data from additional data sources to be able to support our business needs for BI (Business Intelligence) and AI (Artificial Intelligence).

Responsibilities
Design and Architecture:
o Collaborate with cross-functional teams to understand business requirements and translate them into scalable data solutions.
o Create robust data pipelines, ensuring efficient data ingestion, transformation, and storage.
o Architect data lakes, warehouses, and other analytics components within Microsoft Fabric.
Data Transformation and Modeling:
o Continue to develop and support the existing BI (Business Intelligence) platform, with traditional data warehousing, Power BI, and Excel.
o Contribute to the transformation of solutions currently delivered from our own data center to cloud services.
o Implement data modeling best practices, optimizing data structures for performance and scalability.
o Leverage Microsoft Fabric components such as dataflows, notebooks, and semantic models to transform raw data into valuable assets.
o Ensure version control and deployment practices are followed.
Exploratory Analytics:
o Dive deep into data to uncover patterns, anomalies, and insights.
o Use SQL, DAX (Data Analysis Expressions), and Python to explore and analyze data effectively.
Collaboration:
o Work closely with the organization to deliver end-to-end data solutions.
o Collaborate with data analysts to create compelling visualizations.
Monitor the field:
o Monitor the field in terms of both technology and information management.

· You have a systematic approach and the ability to work independently and to drive both your own and others' work forward.
· You have analytical ability and can easily understand the challenges and needs of the business.
· You have a great interest in problem solving and keep yourself updated on the development of new functions and technologies in the area.
· You are motivated by developing and streamlining, and you can find innovative solutions that create value and business benefits.
· You see it as a matter of course to always be extremely customer- and service-oriented, and you are perceived by those around you as easy-going and communicative in contact with others.
· For you, it is a matter of course to turn problems and setbacks into opportunities and development potential.
Since we are part of a global group, you need to be able to communicate fluently in English, both orally and in writing. It is desirable that you can also communicate in Swedish, both orally and in writing.
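As a rough illustration of the exploratory analytics item above (using Python in a notebook to surface patterns and anomalies), here is a small pandas sketch; the file name and columns are hypothetical and not from Fagerhult's environment.

```python
# Illustrative notebook-style exploration: profile monthly order volumes and
# flag anomalous months. File name and columns are hypothetical.
import pandas as pd

orders = pd.read_csv("orders_sample.csv", parse_dates=["order_date"])

# Monthly profile: number of orders and total/average order value.
monthly = (
    orders.set_index("order_date")
          .resample("MS")["order_value"]
          .agg(["count", "sum", "mean"])
)

# Flag months whose total value deviates more than 3 standard deviations.
z_scores = (monthly["sum"] - monthly["sum"].mean()) / monthly["sum"].std()
print(monthly[z_scores.abs() > 3])
```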
As the position may involve travel, you need to have a B driving licence.

Qualifications
· Bachelor's degree in computer science, engineering, or a related field.
· Knowledge of cloud platforms (Azure, AWS, or GCP).
· Experience:
o Proven experience designing and implementing enterprise-scale data analytics solutions.
o Strong understanding of data engineering concepts and best practices.
Desired skills:
· Certification: Microsoft Certified: Fabric Analytics Engineer Associate (Exam DP-600).
· Experience with big data technologies (Hadoop, Spark, etc.).
· Familiarity with data governance and security practices.
· Documented experience in project management or agile development.
· Experience with ERP systems, e.g. Jeeves or Dynamics 365.

Why join us?
You will have the opportunity to work with innovative technologies in a collaborative and innovative work environment. If you are passionate about data engineering, thrive in a dynamic environment, and want to contribute to impactful projects, we encourage you to apply! Send in your CV and a personal letter telling us why you are the perfect candidate for the role of Data Engineer with us. Do not hesitate to send in your application as soon as possible, as we work with continuous selection. If you have further questions about the position, you are welcome to contact Åsa Tjernqvist.

About Fagerhult
Fagerhult creates first-class lighting solutions that improve people's well-being in professional and public environments. With sustainability and smart connected lighting at the centre, we focus on offices, schools, healthcare, retail, and outdoor applications. We strive for close collaboration with customers and partners in the European market and provide lighting solutions globally, often tailored to the customer. The Fagerhult brand comprises both the product company Fagerhults Belysning AB (based in Fagerhult, Sweden) and 12 sales companies located around Europe. Based on knowledge of light's positive impact on people, we develop and produce innovative lighting solutions that are marketed and sold through Fagerhult's sales companies. In total we are around 1,000 people, of whom close to 37% are women and 63% are men. Fagerhult is part of the Fagerhult Group, one of Europe's leading lighting companies, with 12 different brands and 4,100 employees in 27 countries around the world. Read more about us here: https://www.fagerhult.com/sv/o...

8 May 2024
Application deadline:
26 May 2024
Data Engineer to Envoke Talent

About the role:
As a consultant with us, you work on assignments for one of our clients within real estate and retail. Some of your typical tasks may include:
Migrating existing Data Warehouse environments to Microsoft Fabric, with both structured and unstructured data in real time.
Setting up Predictive Analytics scenarios in Azure Machine Learning, in close dialogue with the business.
Building automated BI reports with Copilot.
In short, you get to work with the latest technology within Business Intelligence and Data Analytics. You need to be comfortable in the consultant role, since you independently drive the dialogue and meetings with the clients on your assignments.

You need to know/have:
SQL
Python
Azure
Power BI
An interest in Machine Learning and AI
A relevant post-secondary education
Fluent Swedish and English, spoken and written
Experience of working in an agile way

It is a merit if you also know:
Copilot
Microsoft Fabric
C#
C++

About us:
At Envoke Talent we offer you great freedom and flexibility in when and from where you work, as well as exciting assignments. Our client base spans several different industries and gives you the opportunity to explore varied projects within Business Intelligence and Data Analytics. We value personal development and therefore offer online training and coaching (both individually and in groups) - all to support your development in both technology and soft skills. Want to know more about what it is like to be a consultant with us? Check out our article series "Data Lakes på distans", which is about one of our development teams: Part 1 - Part 2 - Part 3. If you have read this far, chances are you are the right person for the role. We look forward to your application!

This is what the recruitment process looks like:
We review how your application matches the role you applied for.
Do you match the profile? Then the next step is an interview with us at Envoke Talent.
Technical interview and possibly a work sample.
The right match? Congratulations on your new job!
If you are told it was not the right match this time, we are happy to keep your details and present other positions that match your profile. Selection is ongoing, and the position may be filled before the application deadline.

7 May 2024
Application deadline:
24 October 2024
Senior Data Engineer

Are you an experienced Data Engineer who wants to take on a senior role with our clients? Perfect! We are looking for someone who enjoys problem solving, wants to deliver customer value through automation, and likes collaborating in a team. You are comfortable teaching others and are used to being the one who understands what business value is sought and can propose a solution accordingly. We also believe that you have solid experience of working in large organisations and are comfortable navigating complex systems. You are used to talking to different stakeholders and have the ability to communicate technology and solutions to them.

EXPERIENCE
5+ years of experience in data engineering / systems development.
Design of pipelines, ETL, data warehousing, data lakes, data modeling, and event-driven architecture.

SOME OF THESE?
Python, SQL, Hadoop / Spark, GCP/Azure/AWS, Terraform/Ansible

Logikfabriken is a consulting company by and for talented, committed, and fun people in tech. Our ambition is to become Sweden's best and most inspiring employer. Together we create the conditions and drive the company forward. Spending time together is something we value, which is why we go on two conferences per year and arrange talks, afterworks, and much more fun. In addition to the above, we believe in fair compensation and transparency. That is why everyone at Logikfabriken has the same terms and sets their own salary, pension, parental pay, number of vacation days, and other benefits. Our model is flexible, secure, and in most cases means a substantial salary increase. Here you can see for yourself what you could earn: Räkna ut din lön (Calculate your salary). We are also passionate about sustainability, which is why Logikfabriken has been climate neutral since 2017 and works constantly to reduce our footprint. You can find our full offer here: Logikfabrikens FAQ

30 April 2024
Application deadline:
17 October 2024
Data Engineer

Data Engineer
Podme was founded in 2017 with a vision to enrich everyday life by taking podcast entertainment to its full potential. Podme is your go-to place for quality podcasts and brings you a fresh selection of curated content, including popular titles found nowhere else. Backed by Schibsted News Media, the largest media group in Scandinavia, we are now on a growth journey demanding strong data capabilities. If you love a great story and want to help shape the future of audio content, join Podme and be part of our exciting journey! We are seeking a Data Engineer to join our team in Stockholm and help us unlock our full potential. In this role, you will utilize your data engineering skills to support our mission.

Responsibilities:
- Design, build, and maintain our data warehouse, including fact, dimension, and aggregated tables.
- Develop and optimize ELT processes for performance, reliability, and scalability.
- Implement data security and privacy policies and practices.
- Collaborate with cross-functional teams to identify business requirements and create data-driven solutions.
- Stay current with the latest technologies and best practices.
- Deepen your knowledge of our data by designing and implementing our data modeling.
- Take ownership of the key features you deliver.

Qualifications:
- Expertise in data warehousing, including data ingestion and modeling.
- Experience with data historization, particularly Slowly Changing Dimensions (SCD) Type II.
- Strong SQL and Python programming skills.
- Analytical mindset with strong data skills to analyze information and build robust data pipelines for complex challenges.
- Excellent collaboration and communication skills for working with various stakeholders.
- Curiosity, adaptability, and a balance between speed and precision.
- Understanding of data integrity and privacy policies.

Advantages:
- Familiarity with Google Cloud Platform (GCP), BigQuery, and ETL tools such as Airflow, dbt, and Apache Beam.
- Experience with infrastructure management tools like Terraform and CI/CD pipelines.
- Aptitude for A/B testing.

Why Join Us?
We are a vibrant team dedicated to advancing the world of podcasting. As a Data Engineer, you'll work alongside passionate data professionals at a growing company shaping the future of audio content.
- Dynamic Work Environment: Thrive in a fast-paced, innovative culture where your work fuels our mission to elevate the podcast experience.
- Collaborative Team: Contribute to an inclusive atmosphere centered around knowledge-sharing and teamwork, where your insights are valued.
- Impactful Work: Play a vital role in building and maintaining data infrastructure to empower creators in producing high-quality podcasts.
- Continuous Learning and Growth: Develop your skills through code reviews, pair programming, and tech talks. Join hack days and other events that foster creativity.
- Flexibility and Work-Life Balance: Balance remote work with 2-3 days per week in our Stockholm office.
- Wellness and Healthcare: Enjoy comprehensive healthcare insurance and wellness allowances.
- Access to Premium Podcasts: Gain insider access to Podme's premium podcasts and learn from the content you help create.
Note: A relocation package is not currently available. If you're ready to help shape the future of podcast entertainment and join a passionate team, we'd love to hear from you!

Podme is a fast-growing Nordic podcast platform where you find some of the biggest, most liked, and best (according to us) podcasts. What makes us unique is that we give our listeners access to our Premium podcasts, which are exclusive and ad-free. Think Netflix, but for podcasts. We launched our service in 2017, and today we have a great team of 50+ people focused on growth and further innovating the experience for our listeners.
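Since the qualifications single out Slowly Changing Dimensions (SCD) Type II, here is a rough sketch of that pattern expressed as BigQuery SQL run from Python. The dataset, table, and column names (warehouse.dim_customer, staging.customers, plan) are hypothetical, not Podme's.

```python
# Rough SCD Type II sketch: close out changed rows, then insert new versions.
# Table and column names are hypothetical; assumes default GCP credentials.
from google.cloud import bigquery

client = bigquery.Client()

close_changed_rows = """
UPDATE `warehouse.dim_customer` d
SET valid_to = CURRENT_DATE(), is_current = FALSE
WHERE d.is_current
  AND EXISTS (SELECT 1 FROM `staging.customers` s
              WHERE s.customer_id = d.customer_id AND s.plan != d.plan)
"""

insert_new_versions = """
INSERT INTO `warehouse.dim_customer`
  (customer_id, plan, valid_from, valid_to, is_current)
SELECT s.customer_id, s.plan, CURRENT_DATE(), DATE '9999-12-31', TRUE
FROM `staging.customers` s
LEFT JOIN `warehouse.dim_customer` d
  ON d.customer_id = s.customer_id AND d.is_current
WHERE d.customer_id IS NULL OR s.plan != d.plan
"""

# History is preserved: changed rows are closed with an end date, and a new
# row with an open-ended validity interval is inserted for each new or
# changed key.
for statement in (close_changed_rows, insert_new_versions):
    client.query(statement).result()
```

Tools such as dbt snapshots implement essentially this pattern out of the box.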

29 April 2024
Application deadline:
23 July 2024
Senior Data Engineer - Analytics Engineering

At Epidemic Sound, we make soundtracking content simple and inspiring with unlimited access to music and sound effects. Headquartered in Stockholm, with offices all over the world, we are a force of over 500 on a mission to soundtrack the world. Our innovative licensing model paves the way for creators - from YouTubers to small businesses to the world's most recognizable brands such as Netflix, Red Bull, and BBC - to use sound to enhance their content while simultaneously supporting artists both financially and creatively. Epidemic Sound music is heard 2 billion times a day on YouTube alone, and we are recognized as one of Europe's fastest-growing companies by the Financial Times. Epidemic Sound is backed by EQT, Blackstone, Creandum, and Atwater Capital, to name a few. Join our mission to soundtrack the world!

We are now looking for an experienced Data Engineer to join our Analytics Engineering team to help us evolve our data warehouse, data modelling, and data consumer interfaces. The Analytics Engineering team is part of the Data Platform area - our division dedicated to building our data, infrastructure, and insights platform.

How you will make an impact
Join an organisation with passionate Data Engineers and Data Scientists at an early stage of the company's history. This is a chance to be part of building a data analytics platform that scales worldwide and helps Epidemic Sound stay ever more data-informed. In this senior role, you will play a pivotal part in evolving our data warehouse, unlocking insights from a diverse array of sources to fuel innovation and growth. Our data stack is built on Google Cloud Platform, and our solution stack includes BigQuery as a data warehouse, Cloud Composer (Airflow) for orchestration, Snowplow for event tracking, Dataflow for data processing, dbt Core for data transformation, and Looker as our BI layer. You can expect to:
Lead and collaborate with a talented and diverse team to build and enhance our data warehouse, core data sets, and data consumer interfaces.
Design, build, and maintain the Epidemic Sound core data models, and guide, encourage, and educate stakeholders on how to use them.
Drive data architecture, ensuring adherence to best practices and fostering a culture of excellence in data management.
Act as a strategic partner to engineering and analytics teams, delivering high-quality data sets and fostering data-informed decision-making.
Mentor and guide junior team members, sharing your expertise and promoting continuous learning and growth.
Continuously explore and adopt new technologies and best practices to improve our data architecture and consumption layer, as well as team productivity.
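As a sketch of how a stack like the one described is often wired together, here is a minimal Airflow DAG that triggers dbt Core builds and tests on a schedule. The DAG id, schedule, and project path are hypothetical, not Epidemic Sound's actual setup.

```python
# Minimal Airflow DAG sketch: orchestrate dbt Core runs and tests.
# DAG id, schedule, and paths are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="core_models_daily",
    start_date=datetime(2024, 1, 1),
    schedule="0 5 * * *",   # build core models every morning
    catchup=False,
) as dag:
    run_models = BashOperator(
        task_id="dbt_run",
        bash_command="dbt run --project-dir /opt/dbt/core --target prod",
    )
    test_models = BashOperator(
        task_id="dbt_test",
        bash_command="dbt test --project-dir /opt/dbt/core --target prod",
    )

    run_models >> test_models  # only test what was just built
```

On Cloud Composer the project path and target would come from the environment; a BashOperator calling dbt is just one common way to trigger dbt Core from Airflow.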
In order for you to thrive in this role, we believe that you have:
Strong proficiency in SQL, dbt, and data modelling
Proficiency in Python and Git
Expertise in cloud-based data warehouses such as BigQuery, along with strong proficiency in query design aimed at optimising performance within these environments
Strong business acumen coupled with a product-oriented mindset

You definitely have the following:
A passion for teamwork, continuous improvement, and learning
A strong interest and curiosity in data
A habit of following software engineering best practices in your work
A passion for sharing your knowledge and teaching others what you know
The ability to communicate in English with professional working proficiency

Nice-to-have experience:
Experience with any of the following tooling: Apache Airflow, Looker, Snowplow, Mixpanel, Hex, OpsGenie
Having been part of a scale-up company

If you believe you're a strong match for this position, even if you don't meet every requirement perfectly, we encourage you to apply! We value diverse experiences and understand that expertise can manifest in various ways.

Equal Opportunity Employer: We believe that bringing people together from different backgrounds, experiences, and perspectives makes for a healthy workplace, a more successful business, and a better world. We value diversity and encourage everyone to come and soundtrack the world with us.

Application
Do you want to be a part of our fantastic team? Please apply, in English, by clicking the link below.

26 April 2024
Application deadline:
13 October 2024
Data Engineer

Do you want to find that sweet spot between a fast-moving start-up and an established company? Welcome to Knowit Sharp in Malmö. Here you are welcomed by a driven and cheerful team with expert knowledge in several areas. It doesn't get more exciting than that!

About us
Nobody remembers a coward! Knowit in Malmö has a history of daring to invest and paving the way for new opportunities, and we have no plans to stop. Despite somewhat tougher times, we are investing wholeheartedly in Datalytics and want to build an experienced team that wants to be part of it from the start. With us you get a taste of that lovely start-up feeling while being part of a secure and established company. Knowit Sharp is a specialist company with 50 employees delivering services within Datalytics, Cloud Services, and mobile applications. We offer our customers end-to-end solutions where we deliver full-scale digital projects, maintenance, and further development. Our assignments take many different forms, from in-house projects run from our office to specialist assignments on site with our customers. Here you get the opportunity to work closely not only with other data engineers but also with cloud architects, app developers, and cloud strategists. We believe in collaboration and community to develop ourselves, our employees, and the company.

About the role and you
At Knowit in Malmö you get to be part of a varied and creative everyday life. We have exciting assignments in industries such as retail, banking, automotive, and industry. You will work with everything from the small customer that has just discovered the possibilities of being data-driven to the large customer that has come some way and needs specialist expertise. You and your future colleagues often work with technologies such as:
Coding — Python, SQL
Cloud services — Azure, AWS & Google Cloud
ETL pipelines — Databricks, Dagster, DBT
Data storage — Data Lakes, Lakehouse, Snowflake
Dataframes/datasets — Notebooks, Spark SQL
It is a substantial list, we know. We do not expect you to have done absolutely everything - perhaps you have specialised in something above? Fear not: then you are probably just right for us! If you have also started looking at Microsoft Fabric, we consider that a strong merit. Since our customers speak both Swedish and English, we are looking for someone who masters both languages in speech and writing. We believe you have at least five years of experience working as a Data Engineer. Previous experience as a consultant is a merit. We hope that you, just like us, enjoy finding new ways of working and jointly solving our customers' challenges.

Briefly about Knowit in Malmö
We occupy three floors of newly renovated premises on Stortorget, with a fantastic roof terrace overlooking the whole city and the strait. We offer you a fast-moving, fun, and inspiring workplace where we solve difficult problems together by being an unbeatable team. We support, encourage, and learn from each other daily. With us you are offered secure employment with generous health and wellness allowances as well as pension and insurance benefits. We work hard on equality, the environment, and sustainability, and we are proud and happy to be both climate neutral and the most gender-equal listed company.

Application
We use assessments in the recruitment process to ensure a fair evaluation of you as a candidate and a competence-based selection. We work with ongoing selection, so you are very welcome to apply for the position today! Do you have questions?
You are warmly welcome to contact Elin Zahlander (recruitment manager) at [email protected] or 073-656 35 75, or Axel Holtås (CEO) at [email protected] or 070-591 22 56.

12 April 2024
Application deadline:
29 September 2024
Experienced Data Engineer

With us you get to work with colleagues and organisations that, just like you, have understood the power of data. Together with our customers we create the conditions for better decision-making and more efficient organisations. Thanks to high demand for our expertise, we are now adding more people to the team. Apply today and become our colleague!

About the role
As a Data Engineer and consultant at Knowit, you get the opportunity to work across the entire Data and Analytics chain, from strategy, development, data processing, and analysis to visualisation. You work with cutting-edge technical solutions that help our customers gain insights and make data-driven decisions. As a consultant you gain experience and knowledge of several different industries and businesses. Your main tasks will involve designing, developing, and maintaining automated workflows and pipelines to ingest, organise, and transform data from different sources. In cross-functional teams you enable data analysts and machine learning engineers to build robust analytics solutions by making sure data is in the right place, in the right format, and at the right time.

About you
You want to work with the technology and at the same time enjoy communicating the business opportunities that new technology brings. You thrive with many points of contact and happily collaborate with people from both the business and IT. As a person you take responsibility and therefore appreciate self-leadership. If you have not been a consultant before, you are curious to try this broad role. Since our customers are both national and international, you need to be fluent in Swedish and English, spoken and written.

We are looking for someone with at least 5 years of relevant experience and good knowledge of:
Data pipelines - for example Azure Data Factory, DBT, AWS Data Pipeline, Microsoft SSIS, SAS DI Studio
BI frameworks and architecture
Coding in, for example, Python and SQL
Data warehousing and data modelling
Database management
Agile methods

We offer
... a team of 45 Data and Analytics specialists who see their workplace as unpretentious, fun, and inspiring. With us you get a varied and secure work environment where your professional development is valued. We hold regular talks and seminars where we share our experience, and we have a strong sense of community with recurring team-building events, afterworks, conference trips, and the like.

Apply today!
Contact Jesper Senke, the responsible recruiter, if you want to know more about Knowit and the role we can offer you. Recruitment is ongoing.

11 April 2024
Application deadline:
28 September 2024