Hartford, United States
We continue to build software for innovative home health solutions that produce better outcomes and reduce overall costs through partnerships with providers and payors. The Universal Data Hub (UDH) engineering team builds our core Big Data software components. We are looking for a Senior Developer to contribute meaningfully to our UDH team.
Leveraging your experience in Big Data, Stream processing, Real-time and Batch processing, Application Development and Analytics, you will join a product team that is building a new platform. We want you to bring your talent and expertise to drive the change needed within the healthcare industry.
As Senior Developer, UDH, you will be responsible for working independently and delivering big data software components. You will work within a scrum team and collaborate closely with other developers, the Product Owner, and QA team members to ensure quality delivery.
- Independently design, develop, execute, deliver and maintain components of large software products, with a focus on streaming jobs, real-time CDC (change data capture) and implementing business logic
- Independently contribute to an Agile software development team composed of remote onshore and offshore development members
- Adhere to and contribute to the defined processes while still delivering efficiently
- Deliver high-quality, secure, scalable and maintainable production software with a predictable and consistent velocity
- Maintain a high standard of technical documentation
- Actively participate in interactions with Business Owners to understand business requirements, and comfortably convey application and technology decisions to non-technical users
- Collaborate with DevOps team to design and develop CI/CD architecture
- Bachelor’s Degree in Computer Science or equivalent work experience required
- 10+ years' experience designing, developing, delivering and maintaining large, scalable enterprise systems, including at least the last 4 years in the Cloudera Hadoop ecosystem
- Expert-level Scala developer; proficient in Java and Python
- Proficient in Big Data, distributed stream processing and message broker technologies, specifically the Cloudera Hadoop ecosystem (YARN, Spark, Hive, Impala, Airflow) and Kafka
- Strong knowledge of API development and RESTful web services
- Knowledge of distributed, loosely coupled application development using Scala and Java frameworks (Spring Boot, Spring Batch) is highly desired
- Experience with Application Servers (WebSphere, Tomcat, JBoss) is desirable
- Good knowledge of relational databases, specifically Oracle (SQL and PL/SQL)
- Ability to produce technical designs, and to own and drive committed deliverables to completion with minimal guidance
- Experience with the full software development lifecycle, including design, architecture, development, testing, deployment and maintenance
- Self-motivated, curious, eager to learn and able to thrive in a fast-paced, remote environment
- Excellent communication skills to collaborate effectively with remote team members