Saturday, September 12, 2015

Big Data Specialist vacancy in Englewood, Colorado United States

Job Title : Database Administrator/Dev Ops Resource/HBase DBA/Big Data Specialist
Location: Englewood, CO 80112
Interview Process: Phone and then In-Person/SKYPE
Position Summary:
  • This individual will be part of a software development engineering team responsible for analyzing, developing, testing, implementing, documenting, and supporting enhancements and fixes to a large application. 
  • The applications are critical, Tier 1 applications, and are used 24/7. 
  • This resource will be part of a cross functional team working on major resiliency and infrastructure improvements and changes.
Major Duties
  • Ability to think strategically and relate architectural decisions/recommendations to business needs and client culture
  • Knowledge of how to assess the performance of data solutions, how to diagnose performance problems, and tools used to monitor and tune performance
  • Hands-on project and development work, as demanded by the project and client role
  • Interact with Project Manager & Java Developers to understand requirements and scope the systems
  • Partner with Java Developers to create PL/SQL stored procedures, provide query tuning expertise, and perform performance tuning for both MySQL and NoSQL environments
  • Support production environments with strong troubleshooting skills, quickly narrowing down data problems in high-pressure situations
  • Handle multiple tasks - ensure that tasks are completed in a timely manner with limited direction
  • Participate in team meetings to discuss approaches to current projects
  • Comply with all established procedures and policies of Comcast
  • Punctual, regular, and consistent attendance
  • Other functions that may be assigned
Minimum Requirements
  • Deep understanding of data warehouse approaches, industry standards, and industry best practices
  • Extensive experience with very large data repositories (terabyte scale or larger)
  • High level understanding of Hadoop framework and Hadoop ecosystem
  • Hands on experience with at least one Hadoop distribution
  • Knowledge of MapReduce and MapReduce platforms like Pig or Hive
  • Development experience with Big Data/NoSQL platforms, such as HBase, MongoDB, or Apache Cassandra
  • Experience performance tuning RDBMS and NoSQL databases
  • Expert knowledge of SQL and NoSQL tools
  • Deep understanding of Unix platforms and scripting languages such as Perl and/or Python
  • Java experience
  • Excellent interpersonal, team management, facilitation, and communication skills; must be able to communicate effectively at all levels of the client’s organization
  • Experience with the full Systems Development Life Cycle (SDLC)
  • Strong problem solving and troubleshooting skills
  • Works well in a team environment
  • Excellent written and verbal communications skills
Preferred Requirements:
  • Experience with data quality measurement
  • Experience with database replication technologies such as Oracle Streams and/or GoldenGate
  • Experience with NoSQL databases
  • Experience with Pentaho ETL framework
  • Experience with UC4 job control and DB triggers
  • Experience developing Network Monitoring applications
  • Experience with high transaction/throughput processing in a highly available environment
  • “Automate everything” mindset
  • Strong interpersonal skills; capable of writing recommendations and interacting with company personnel on complex topics
  • Demonstrated experience participating on teams of technical experts in diverse, fast-paced, 24x7 environments
  • Proven experience meeting large deliverables with fixed and aggressive deadlines
Send resume to adelina@jobs-n-jobs.com