Data Lake/Data Warehouse API Developer
We are seeking a skilled Data Lake/Data Warehouse API Developer with expertise in Java or Scala to join our team. The ideal candidate will be responsible for designing, developing, and maintaining APIs that interface with our data lake or data warehouse systems. You will work closely with data engineers, data scientists, and business stakeholders to deliver efficient and reliable data access solutions.

You Will:
· Design and develop RESTful APIs to expose data lake or data warehouse contents securely and efficiently.
· Collaborate with data engineers to define data access patterns and optimize query performance.
· Implement authentication and authorization mechanisms to ensure data security and access control.
· Handle data ingestion, transformation, and serialization tasks as required.
· Monitor and optimize API performance and scalability to meet high-throughput, low-latency requirements.
· Collaborate with cross-functional teams to understand business requirements and translate them into API features.
· Ensure proper error handling, logging, and exception management for robust API reliability.
· Maintain API documentation for developers and users.
· Stay current with industry best practices and emerging technologies in data integration and API development.

Technical Requirements:
· Bachelor's degree in Computer Science, Information Technology, or a related field.
· Proven experience designing and developing APIs for data lake or data warehouse environments.
· Proficiency in Java, Scala, or a similar programming language.
· Strong understanding of RESTful API design principles and best practices.
· Experience with authentication and authorization mechanisms (OAuth, JWT, etc.).
· Familiarity with data serialization formats such as JSON, Avro, and Parquet.
· Knowledge of data lake and data warehouse concepts and technologies.
· Ability to work in a collaborative, cross-functional team environment.
· Excellent problem-solving skills and attention to detail.
· Strong communication and documentation skills.
· Understanding of cloud platforms (AWS, Azure, GCP) and their data services (e.g., Databricks, AWS Glue, AWS Athena, or Google BigQuery).

Other Requirements:
· Experience with data processing frameworks (e.g., Apache Spark, Hadoop).
· Knowledge of containerization (Docker, Kubernetes).
· Familiarity with SQL and NoSQL databases.
· Previous work on large-scale data integration projects.

About Rackspace Technology
We are the multicloud solutions experts. We combine our expertise with the world’s leading technologies — across applications, data and security — to deliver end-to-end solutions. We have a proven record of advising customers based on their business challenges, designing solutions that scale, building and managing those solutions, and optimizing returns into the future. Named a best place to work year after year by Fortune, Forbes, and Glassdoor, we attract and develop world-class talent. Join us on our mission to embrace technology, empower customers and deliver the future.

More on Rackspace Technology
Though we’re all different, Rackers thrive through our connection to a central goal: to be a valued member of a winning team on an inspiring mission. We bring our whole selves to work every day. And we embrace the notion that unique perspectives fuel innovation and enable us to best serve our customers and communities around the globe.
We welcome you to apply today and want you to know that we are committed to offering equal employment opportunity without regard to age, color, disability, gender reassignment or identity or expression, genetic information, marital or civil partner status, pregnancy or maternity status, military or veteran status, nationality, ethnic or national origin, race, religion or belief, sexual orientation, or any legally protected characteristic. If you have a disability or special need that requires accommodation, please let us know.