Data Cloud Platform
Are you passionate about leveraging data to drive innovation and change? Do you thrive in dynamic environments where cutting-edge technology meets financial expertise? BlackRock, a global leader in investment management, is currently seeking talented individuals to join its Data Cloud Platform team. This isn't just a job: it's a chance to be part of a transformative journey in the world of finance and technology.
Why BlackRock?
BlackRock stands at the forefront of the financial industry, managing over $10 trillion in assets and serving clients in more than 100 countries. Beyond its impressive financial footprint, BlackRock prides itself on fostering a culture of innovation and inclusivity. Here, employees are encouraged to push boundaries, challenge the status quo, and drive impact through their work.
Roles and Responsibilities
- Work alongside our systems engineers and UI developers to help design and build scalable, automated CI/CD pipelines.
- Help prove out and productionize infrastructure and tooling to support scalable cloud-based applications
- Unlock a myriad of generative AI/ML use cases for Aladdin Data, and thus for BlackRock
- Have fun as part of an awesome team
- Work as part of a multi-disciplinary squad to establish our next generation of data pipelines and tools
- Be involved from the inception of projects: understanding requirements, designing and developing solutions, and incorporating them into the designs of our platforms
- Mentor team members on technology and best practices
- Build and maintain strong relationships between DataOps Engineering and our Technology teams
- Contribute to the open source community and maintain excellent knowledge of the technical landscape for data & cloud tooling
- Assist in troubleshooting issues and support the operation of production software
- Write technical documentation
Qualifications
Data Operations and Engineering
- Comfortable reading and writing Python code for data acquisition and ETL/ELT
- Experience orchestrating data pipelines with Airflow and/or Argo Workflows
- Experience implementing and operating telemetry-based monitoring, alerting, and incident response systems. We aim to follow Site Reliability Engineering (SRE) best practices.
- Experience supporting databases or datastores (e.g. MongoDB, Redis, Cassandra, Ignite, Hadoop, S3, Azure Blob Storage) and messaging and streaming platforms such as NATS or Kafka
Cloud Native DevOps Platform Engineering
- Knowledge of the Kubernetes (K8s) APIs with a strong focus on stateful workloads
- Templating with Helm, ArgoCD, Ansible, and Terraform
- Understanding of the K8s Operator Pattern, with the comfort and courage to wade into (predominantly Go-based) operator implementation code bases
- Comfortable building atop K8s-native frameworks, including service mesh (Istio), secrets management (cert-manager, HashiCorp Vault), log management (Splunk), and observability (Prometheus, Grafana, Alertmanager)
- Experience creating and evolving CI/CD pipelines with GitLab or GitHub, following GitOps principles
Natural/Large Language Models (good to have)
- Experience with NLP coding tasks like tokenization, chunking, tagging, embedding, and indexing supporting subsequent retrieval and enrichment
- Experience with basic prompt engineering, LLM fine-tuning, and chatbot implementations in modern Python SDKs such as langchain and/or transformers
We are looking for candidates with 4+ years of hands-on experience in Data Platform DevOps/Cloud or related engineering practices.
How to Apply
Ready to embark on a career-defining journey with BlackRock? Visit the BlackRock Careers Page to learn more about the Data Cloud Platform Engineer role and submit your application today. Don’t miss your chance to be part of a team that is shaping the future of finance through data-driven insights and innovation.