Senior DevOps Engineer - Elasticsearch (ELK) experience - Whiting House Technologies - Saint Paul, MN
Experience with DevOps concepts. The Senior DevOps Engineer 1 is also responsible for managing, monitoring, and deploying highly available, highly scalable...
Unix/Linux, Microsoft, Oracle, SQL Server, MySQL, MongoDB, SSH, web and app technologies (IIS, Apache, Tomcat, JBoss), VMware, AD, and Storage/SAN....
From Whiting House Technologies - Tue, 16 May 2017 12:38:23 GMT - View all Saint Paul, MN jobs
          Developer (ADMIN BIG DATA) - Standard Bank - Constantia Kloof, Gauteng   
Chef, Elasticsearch/Logstash/Kibana. Standard Bank is a firm believer in technical innovation, to help us guarantee exceptional client service and leading...
From Standard Bank - Thu, 29 Jun 2017 00:06:14 GMT - View all Constantia Kloof, Gauteng jobs
          (USA-MO-St. Louis) Technical Specialist   
*Overview*

Thomson Reuters Technical Services is searching for a highly motivated person to fill a Technical Support position. The team is currently made up of highly skilled, energetic individuals who possess strong analytical, technical, and customer service expertise. We work in the fast-paced environment of Market Data, where our customers expect timely and accurate resolutions. We are looking for individuals who take ownership and follow through on commitments to our customers. The qualified candidate will be expected to adapt quickly and should have a strong desire to learn.

*Role Purpose:*

* Provide specialist technical support for complex issues affecting Thomson Reuters products to customers, internal stakeholders, and 3rd-party engineers.
* Proactively monitor client-site devices and comms; resolve issues remotely or through our 3rd-party partners. Look for trends and implement preventative measures.
* Leverage innovative technologies to ensure globally consistent support tasks.

*Major Responsibilities / Accountabilities:*

* Provide specialized technical support for Thomson Reuters products.
* Record all customer queries, interactions, and investigation progress in the CRM tools provided (Salesforce).
* Keep clients updated throughout the case life-cycle.
* Follow all policies and procedures for managing and escalating customer issues to reduce resolution times.
* Interface with product support and development groups.
* Work with 3rd-party service providers.
* Perform break-fix activities affecting customer sites remotely and arrange on-site dispatches when required, using globally consistent methodologies and tools.
* Provide expert technical support for problem resolution, including reproduction of customer issues.
* Provide high-quality technical advice to internal stakeholders and 3rd-party engineers.
* Maintain awareness of relevant technical and product trends through self-learning/study, training classes, and job shadowing.
* Maintain client-site documentation.
* Escalate major, elusive, and recurrent issues that are impacting clients.
* Able to work a morning shift (starting at 15:00 GMT).
* Able to work weekends on a rotation basis.
* May be required to work as part of the project implementation team to integrate Thomson Reuters products at customer sites.
* May be required to deliver technology or product training to customers.

Thomson Reuters provides professionals with the intelligence, technology, and human expertise they need to find trusted answers. We enable professionals in the financial and risk, legal, tax and accounting, and media markets to make the decisions that matter most, all powered by the world's most trusted news organization.

*Technical/Professional Skills & Competencies:*

* English fluency required.
* Additional business-level Japanese or Cantonese speaking and writing skill is a plus.
* Excellent technical knowledge of Cisco, Juniper, and Linux (preferably CCNA/Linux or similar qualifications).
* Strong problem management, analytical, troubleshooting, and ticket management skills. Logical thinker/problem solver who is self-motivated and a strong contributor within a team.
* Introductory knowledge of TCP/IP, UDP, multicast, BGP, QoS, VLAN, VPN, MPLS, SSL, IGRP, OSPF, IGMP, EIGRP, and other IP routing protocols.
* Knowledge of Compass, OpenStack, Elasticsearch, Kibana, Grafana, or other tools is a plus.
* Ability to communicate effectively, both verbally and in writing, with customers and colleagues at all levels of technical and non-technical skill sets.
* Experience in a customer service environment and outstanding customer service skills.
* Ability to work with virtual teams to successfully deliver projects or resolutions to escalations.
* Good understanding of project management principles.
* Basic knowledge of financial markets.
* Independent worker with excellent time management and escalation skills.
* Demonstrates a can-do attitude in challenging situations.
At Thomson Reuters, we believe what we do matters. We are passionate about our work, inspired by the impact it has on our business and our customers. As a team, we believe in winning as one: collaborating to reach shared goals, and developing through challenging and meaningful experiences. With over 50,000 employees in more than 100 countries, we work flexibly across boundaries and realize innovations that help shape industries around the world. Bring your ambition to make a difference. We'll bring a world of opportunities. As a global business we rely on diversity of culture and thought to deliver on our goals. To ensure we can do that, we seek talented, qualified employees in our operations around the world regardless of race, color, sex/gender, including pregnancy, gender identity and expression, national origin, religion, sexual orientation, disability, age, marital status, citizen status, veteran status, or any other protected classification under country or local law. Thomson Reuters is proud to be an Equal Employment Opportunity Employer providing a drug-free workplace. Intrigued by a challenge as large and fascinating as the world itself? Come join us. To learn more about what we offer, please visit thomsonreuters.com/careers. More information about Thomson Reuters can be found on thomsonreuters.com.

*Job:* Professional & Consulting Services Family Group
*Organization:* F&R CO Customer Support
*Title:* Technical Specialist
*Location:* Missouri-St. Louis-St. Louis-717 Office Pkwy
*Requisition ID:* 17001038
          (USA-WA-Seattle) Applied Scientist   
Seeking Applied Researchers to build the future of the Alexa Shopping Experience at Amazon. At Alexa Shopping, we strive to enable shopping in everyday life. We allow customers to instantly order whatever they need, simply by interacting with their smart devices such as Echo, Fire TV, and beyond. Our services allow you to shop anywhere, easily, without interrupting what you're doing: to go from "I want" to "It's on the way" in a matter of seconds. We are seeking the industry's best applied scientists to help us create new ways to shop. Join us, and help invent the future of everyday life.

The products you would envision and craft require ambitious thinking and a tireless focus on inventing solutions to customer problems. You must be passionate about creating algorithms and models that can scale to hundreds of millions of customers, and insanely curious about building new technology and unlocking its potential. The Alexa Shopping team is seeking an Applied Scientist who will partner with technology and business leaders to build new state-of-the-art algorithms, models, and services that surprise and delight our voice customers. As part of the new Alexa Shopping team, you will use ML techniques such as deep learning to create and put into production models that deliver personalized shopping recommendations, answer customer questions, and enable human-like dialogs with our devices.

The ideal candidate will have a PhD in Mathematics, Statistics, Machine Learning, Economics, or a related quantitative field, and 5+ years of relevant work experience, including:

· Proven track record of achievements in natural language processing, search, and personalization.
· Expertise in a broad set of ML approaches and techniques, ranging from artificial neural networks to Bayesian non-parametric methods.
· Experience in structured prediction and dimensionality reduction.
· Strong fundamentals in problem solving, algorithm design, and complexity analysis.
· Proficiency in at least one scripting language (e.g. Python) and one large-scale data processing platform (e.g. Hadoop, Hive, Spark).
· Experience with cloud technologies (e.g. S3, DynamoDB, Elasticsearch) and experience in data warehousing.
· Strong personal interest in learning, researching, and creating new technologies with high commercial impact.
· Track record of peer-reviewed academic publications.
· Strong verbal/written communication skills, including an ability to effectively collaborate with both research and technical teams and earn the trust of senior stakeholders.

AMZR Req ID: 551723 External Company URL: www.amazon.com
          The Ultimate Data Infrastructure Architect Bundle for $36   
From MongoDB to Apache Flume, This Comprehensive Bundle Will Have You Managing Data Like a Pro In No Time
Expires June 01, 2022 23:59 PST
Buy now and get 94% off

Learning ElasticSearch 5.0


KEY FEATURES

Learn how to use ElasticSearch in combination with the rest of the Elastic Stack to ship, parse, store, and analyze logs! You'll start by getting an understanding of what ElasticSearch is, what it's used for, and why it's important, before being introduced to the new features of ElasticSearch 5.0.

  • Access 35 lectures & 3 hours of content 24/7
  • Go through each of the fundamental concepts of ElasticSearch such as queries, indices, & aggregation
  • Add more power to your searches using filters, ranges, & more
  • See how ElasticSearch can be used w/ other components like LogStash, Kibana, & Beats
  • Build, test, & run your first LogStash pipeline to analyze Apache web logs
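To make the queries-and-aggregations bullet concrete, here is a minimal sketch of an Elasticsearch-style request body combining a full-text query, a range filter, and a terms aggregation. The index and field names (`logs`, `message`, `response_time_ms`, `status`) are hypothetical, invented for illustration:

```python
# A request body with a match query, a range filter, and a terms aggregation.
# Field names are hypothetical examples, not from any real index.
query_body = {
    "query": {
        "bool": {
            "must": [{"match": {"message": "error"}}],          # full-text match
            "filter": [{"range": {"response_time_ms": {"gte": 500}}}],  # range filter
        }
    },
    "aggs": {
        "by_status": {"terms": {"field": "status"}}             # bucket by status code
    },
}

# With the official Python client, this body could be submitted as, e.g.:
#   es.search(index="logs", body=query_body)
print(sorted(query_body))  # -> ['aggs', 'query']
```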

PRODUCT SPECS

Details & Requirements

  • Length of time users can access this course: lifetime
  • Access options: web streaming, mobile streaming
  • Certification of completion not included
  • Redemption deadline: redeem your code within 30 days of purchase
  • Experience level required: all levels

Compatibility

  • Internet required

THE EXPERT

Ethan Anthony is a San Francisco-based Data Scientist who specializes in distributed data-centric technologies. He is also the Founder of XResults, where the vision is to harness the power of data to innovate and deliver intuitive customer-facing solutions, largely to non-technical professionals. Ethan has over 10 combined years of experience in cloud-based technologies such as Amazon Web Services and OpenStack, as well as the data-centric technologies of Hadoop, Mahout, Spark, and ElasticSearch. He began using ElasticSearch in 2011 and has since delivered solutions based on the Elastic Stack to a broad range of clientele. Ethan has also consulted worldwide, speaks fluent Mandarin Chinese, and is insanely curious about human cognition, as related to cognitive dissonance.

Apache Spark 2 for Beginners


KEY FEATURES

Apache Spark is one of the most widely-used large-scale data processing engines and runs at extremely high speeds. It's a framework that has tools that are equally useful for app developers and data scientists. This book starts with the fundamentals of Spark 2 and covers the core data processing framework and API, installation, and application development setup.

  • Access 45 lectures & 5.5 hours of content 24/7
  • Learn the Spark programming model through real-world examples
  • Explore Spark SQL programming w/ DataFrames
  • Cover the charting & plotting features of Python in conjunction w/ Spark data processing
  • Discuss Spark's stream processing, machine learning, & graph processing libraries
  • Develop a real-world Spark application
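The "Spark programming model" bullet above amounts to chaining transformations and then triggering an action. Without a Spark installation, that model can be sketched in plain Python; in PySpark the same chain would run distributed over an RDD (e.g. `sc.parallelize(data).filter(...).map(...).reduce(...)`):

```python
# Plain-Python sketch of Spark's transformation/action chain; no cluster needed.
from functools import reduce

data = [1, 2, 3, 4, 5, 6]

evens = filter(lambda x: x % 2 == 0, data)    # transformation: keep even numbers
squared = map(lambda x: x * x, evens)         # transformation: square each survivor
total = reduce(lambda a, b: a + b, squared)   # action: fold results to one value

print(total)  # -> 56  (4 + 16 + 36)
```

In real Spark the transformations are lazy and only the final action triggers computation; this sketch only mirrors the shape of the API, not its execution model.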

PRODUCT SPECS

Details & Requirements

  • Length of time users can access this course: lifetime
  • Access options: web streaming, mobile streaming
  • Certification of completion not included
  • Redemption deadline: redeem your code within 30 days of purchase
  • Experience level required: all levels

Compatibility

  • Internet required

THE EXPERT

Rajanarayanan Thottuvaikkatumana, Raj, is a seasoned technologist with more than 23 years of software development experience at various multinational companies. He has lived and worked in India, Singapore, and the USA, and is presently based out of the UK. His experience includes architecting, designing, and developing software applications. He has worked on various technologies including major databases, application development platforms, web technologies, and big data technologies. Since 2000, he has been working mainly in Java related technologies, and does heavy-duty server-side programming in Java and Scala. He has worked on very highly concurrent, highly distributed, and high transaction volume systems. Currently he is building a next generation Hadoop YARN-based data processing platform and an application suite built with Spark using Scala.

Raj holds one master's degree in Mathematics, one master's degree in Computer Information Systems and has many certifications in ITIL and cloud computing to his credit. Raj is the author of Cassandra Design Patterns - Second Edition, published by Packt.

When not working on the assignments his day job demands, Raj is an avid listener to classical music and watches a lot of tennis.

Designing AWS Environments


KEY FEATURES

Amazon Web Services (AWS) provides trusted, cloud-based solutions to help businesses meet all of their needs. Running solutions in the AWS Cloud can help you (or your company) get applications up and running faster while providing the security needed to meet your compliance requirements. This course leaves no stone unturned in getting you up to speed with administering AWS.

  • Access 19 lectures & 2 hours of content 24/7
  • Familiarize yourself w/ the key capabilities to architect & host apps, websites, & services on AWS
  • Explore the available options for virtual instances & demonstrate launching & connecting to them
  • Design & deploy networking & hosting solutions for large deployments
  • Focus on security & important elements of scalability & high availability

PRODUCT SPECS

Details & Requirements

  • Length of time users can access this course: lifetime
  • Access options: web streaming, mobile streaming
  • Certification of completion not included
  • Redemption deadline: redeem your code within 30 days of purchase
  • Experience level required: all levels

Compatibility

  • Internet required

THE EXPERT

Wayde Gilchrist started moving customers of his IT consulting business into the cloud and away from traditional hosting environments in 2010. In addition to consulting, he delivers AWS training for Fortune 500 companies, government agencies, and international consulting firms. When he is not out visiting customers, he is delivering training virtually from his home in Florida.

Learning MongoDB


KEY FEATURES

Businesses today have access to more data than ever before, and a key challenge is ensuring that data can be easily accessed and used efficiently. MongoDB makes it possible to store and process large sets of data in ways that drive business value. Learning MongoDB will give you the flexibility of unstructured storage, combined with robust querying and post-processing functionality, making you an asset to enterprise Big Data needs.

  • Access 64 lectures & 40 hours of content 24/7
  • Master data management, queries, post processing, & essential enterprise redundancy requirements
  • Explore advanced data analysis using both MapReduce & the MongoDB aggregation framework
  • Delve into SSL security & programmatic access using various languages
  • Learn about MongoDB's built-in redundancy & scale features, replica sets, & sharding
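The aggregation framework mentioned above is a pipeline of stages that documents flow through. A minimal sketch of such a pipeline follows; the collection name (`orders`) and field names are hypothetical, and with pymongo the list would be passed to `db.orders.aggregate(pipeline)`:

```python
# A MongoDB aggregation pipeline: filter, group, then sort.
# Field names are invented for illustration.
pipeline = [
    {"$match": {"status": "shipped"}},            # keep only shipped orders
    {"$group": {"_id": "$customer",               # one bucket per customer
                "total": {"$sum": "$amount"}}},   # sum order amounts per bucket
    {"$sort": {"total": -1}},                     # biggest spenders first
]

stages = [next(iter(stage)) for stage in pipeline]
print(stages)  # -> ['$match', '$group', '$sort']
```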

PRODUCT SPECS

Details & Requirements

  • Length of time users can access this course: lifetime
  • Access options: web streaming, mobile streaming
  • Certification of completion not included
  • Redemption deadline: redeem your code within 30 days of purchase
  • Experience level required: all levels

Compatibility

  • Internet required

THE EXPERT

Daniel Watrous is a 15-year veteran of designing web-enabled software. His focus on data store technologies spans relational databases, caching systems, and contemporary NoSQL stores. For the last six years, he has designed and deployed enterprise-scale MongoDB solutions in semiconductor manufacturing and information technology companies. He holds a degree in electrical engineering from the University of Utah, focusing on semiconductor physics and optoelectronics. He also completed an MBA from the Northwest Nazarene University. In his current position as senior cloud architect with Hewlett Packard, he focuses on highly scalable cloud-native software systems.

Learning Hadoop 2


KEY FEATURES

Hadoop emerged in response to the proliferation of masses and masses of data collected by organizations, offering a strong solution to store, process, and analyze what has commonly become known as Big Data. It comprises a comprehensive stack of components designed to enable these tasks on a distributed scale, across multiple servers and thousands of machines. In this course, you'll learn Hadoop 2, introducing yourself to the powerful system synonymous with Big Data.

  • Access 19 lectures & 1.5 hours of content 24/7
  • Get an overview of the Hadoop component ecosystem, including HDFS, Sqoop, Flume, YARN, MapReduce, Pig, & Hive
  • Install & configure a Hadoop environment
  • Explore Hue, the graphical user interface of Hadoop
  • Discover HDFS to import & export data, both manually & automatically
  • Run computations using MapReduce & get to grips working w/ Hadoop's scripting language, Pig
  • Siphon data from HDFS into Hive & demonstrate how it can be used to structure & query data sets
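The MapReduce computations mentioned above follow a map / shuffle / reduce pattern that can be sketched in a few lines of plain Python (the classic word count). On a real cluster, Hadoop distributes the map and reduce phases and performs the shuffle between them:

```python
# Pure-Python sketch of MapReduce word count; Hadoop would distribute each phase.
from collections import defaultdict

lines = ["big data big", "data pipelines"]   # stand-in for HDFS input splits

# Map phase: emit a (word, 1) pair for every word
mapped = [(word, 1) for line in lines for word in line.split()]

# Shuffle: group emitted values by key
grouped = defaultdict(list)
for word, count in mapped:
    grouped[word].append(count)

# Reduce phase: sum the counts for each word
counts = {word: sum(vals) for word, vals in grouped.items()}
print(counts["big"], counts["data"])  # -> 2 2
```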

PRODUCT SPECS

Details & Requirements

  • Length of time users can access this course: lifetime
  • Access options: web streaming, mobile streaming
  • Certification of completion not included
  • Redemption deadline: redeem your code within 30 days of purchase
  • Experience level required: all levels

Compatibility

  • Internet required

THE EXPERT

Randal Scott King is the Managing Partner of Brilliant Data, a consulting firm specialized in data analytics. In his 16 years of consulting, Scott has amassed an impressive list of clientele from mid-market leaders to Fortune 500 household names. Scott lives just outside Atlanta, GA, with his children.

ElasticSearch 5.x Cookbook eBook


KEY FEATURES

ElasticSearch is a Lucene-based distributed search server that allows users to index and search unstructured content across petabytes of data. Through this ebook, you'll be guided through comprehensive recipes covering what's new in ElasticSearch 5.x as you create complex queries and analytics. By the end, you'll have an in-depth knowledge of how to implement the ElasticSearch architecture and be able to manage data efficiently and effectively.

  • Access 696 pages of content 24/7
  • Perform index mapping, aggregation, & scripting
  • Explore the modules of Cluster & Node monitoring
  • Understand how to install Kibana to monitor a cluster & extend Kibana for plugins
  • Integrate your Java, Scala, Python, & Big Data apps w/ ElasticSearch
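The index-mapping bullet above refers to declaring field types before indexing. A minimal sketch of a 5.x-era mapping follows; the type name (`log`) and field names are hypothetical, and with the official Python client it could be submitted via `es.indices.create(index="logs", body=mapping)`:

```python
# An Elasticsearch 5.x-style index mapping (mapping types were still present
# in 5.x). Names are invented for illustration.
mapping = {
    "mappings": {
        "log": {
            "properties": {
                "timestamp": {"type": "date"},
                "level":     {"type": "keyword"},  # exact-match, aggregatable field
                "message":   {"type": "text"},     # analyzed full-text field
            }
        }
    }
}

fields = sorted(mapping["mappings"]["log"]["properties"])
print(fields)  # -> ['level', 'message', 'timestamp']
```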

PRODUCT SPECS

Details & Requirements

  • Length of time users can access this course: lifetime
  • Access options: web streaming, mobile streaming
  • Certification of completion not included
  • Redemption deadline: redeem your code within 30 days of purchase
  • Experience level required: all levels

Compatibility

  • Internet required

THE EXPERT

Alberto Paro is an engineer, project manager, and software developer. He currently works as freelance trainer/consultant on big data technologies and NoSQL solutions. He loves to study emerging solutions and applications mainly related to big data processing, NoSQL, natural language processing, and neural networks. He began programming in BASIC on a Sinclair Spectrum when he was eight years old, and to date, has collected a lot of experience using different operating systems, applications, and programming languages.

In 2000, he graduated in computer science engineering from Politecnico di Milano with a thesis on designing multiuser and multidevice web applications. He assisted professors at the university for about a year. He then came in contact with The Net Planet Company and loved their innovative ideas; he started working on knowledge management solutions and advanced data mining products. In summer 2014, his company was acquired by a big data technologies company, where he worked until the end of 2015, mainly using Scala and Python on state-of-the-art big data software (Spark, Akka, Cassandra, and YARN). In 2013, he started freelancing as a consultant for big data, machine learning, Elasticsearch, and other NoSQL products. He has created or helped to develop big data solutions for business intelligence, financial, and banking companies all over the world. A lot of his time is spent teaching how to efficiently use big data solutions (mainly Apache Spark), NoSQL datastores (Elasticsearch, HBase, and Accumulo), and related technologies (Scala, Akka, and Playframework). He is often called to present at big data or Scala events. He is an evangelist on Scala and Scala.js (the transcompiler from Scala to JavaScript).

In his spare time, when he is not playing with his children, he likes to work on open source projects. When he was in high school, he started contributing to projects related to the GNOME environment (gtkmm). One of his preferred programming languages is Python, and he wrote one of the first NoSQL backends on Django for MongoDB (Django-MongoDBengine). In 2010, he began using Elasticsearch to provide search capabilities to some Django e-commerce sites and developed PyES (a Pythonic client for Elasticsearch), as well as the initial part of the Elasticsearch MongoDB river. He is the author of Elasticsearch Cookbook as well as a technical reviewer of Elasticsearch Server-Second Edition, Learning Scala Web Development, and the video course, Building a Search Server with Elasticsearch, all of which are published by Packt Publishing.

Fast Data Processing with Spark 2 eBook


KEY FEATURES

Compared to Hadoop, Spark is a significantly simpler way to process Big Data at speed. It is increasing in popularity with data analysts and engineers everywhere, and in this course you'll learn how to use Spark with minimum fuss. Starting with the fundamentals, this ebook will help you take your Big Data analytical skills to the next level.

  • Access 274 pages of content 24/7
  • Get to grips w/ some simple APIs before investigating machine learning & graph processing
  • Learn how to use the Spark shell
  • Load data & build & run your own Spark applications
  • Discover how to manipulate RDDs
  • Understand useful machine learning algorithms w/ the help of Spark MLlib & R

PRODUCT SPECS

Details & Requirements

  • Length of time users can access this course: lifetime
  • Access options: web streaming, mobile streaming
  • Certification of completion not included
  • Redemption deadline: redeem your code within 30 days of purchase
  • Experience level required: all levels

Compatibility

  • Internet required

THE EXPERT

Krishna Sankar is a Senior Specialist (AI Data Scientist) with Volvo Cars, focusing on Autonomous Vehicles. His earlier stints include Chief Data Scientist at http://cadenttech.tv/, Principal Architect/Data Scientist at Tata America Intl. Corp., Director of Data Science at a bioinformatics startup, and Distinguished Engineer at Cisco. He has spoken at various conferences, including ML tutorials at Strata SJC and London 2016, Spark Summit, Strata-Spark Camp, OSCON, PyCon, and PyData; writes about Robots Rules of Order, Big Data Analytics—Best of the Worst, predicting the NFL, Spark, Data Science, Machine Learning, and Social Media Analysis; and has been a guest lecturer at the Naval Postgraduate School. His occasional blogs can be found at https://doubleclix.wordpress.com/. His other passions are flying drones (he is working towards a Drone Pilot License, FAA UAS Pilot) and Lego Robotics; you will find him at the St. Louis FLL World Competition as a Robot Design Judge.

MongoDB Cookbook: Second Edition eBook


KEY FEATURES

MongoDB is a high-performance, feature-rich, NoSQL database that forms the backbone of the systems that power many organizations. Packed with easy-to-use features that have become essential for a variety of software professionals, MongoDB is a vital technology to learn for any aspiring data scientist or systems engineer. This cookbook contains many solutions to the everyday challenges of MongoDB, as well as guidance on effective techniques to extend your skills and capabilities.

  • Access 274 pages of content 24/7
  • Initialize the server in three different modes w/ various configurations
  • Get introduced to programming language drivers in Java & Python
  • Learn advanced query operations, monitoring, & backup using MMS
  • Find recipes on cloud deployment, including how to work w/ Docker containers alongside MongoDB

PRODUCT SPECS

Details & Requirements

  • Length of time users can access this course: lifetime
  • Access options: web streaming, mobile streaming
  • Certification of completion not included
  • Redemption deadline: redeem your code within 30 days of purchase
  • Experience level required: all levels

Compatibility

  • Internet required

THE EXPERT

Amol Nayak is a MongoDB certified developer and has been working as a developer for over 8 years. He is currently employed with a leading financial data provider, working on cutting-edge technologies. He has used MongoDB as a database for various systems at his current and previous workplaces to support enormous data volumes. He is an open source enthusiast and supports it by contributing to open source frameworks and promoting them. He has made contributions to the Spring Integration project, and his contributions are the adapters for JPA, XQuery, MongoDB, Push notifications to mobile devices, and Amazon Web Services (AWS). He has also made some contributions to the Spring Data MongoDB project. Apart from technology, he is passionate about motor sports and is a race official at Buddh International Circuit, India, for various motor sports events. Earlier, he was the author of Instant MongoDB, Packt Publishing.

Cyrus Dasadia has enjoyed tinkering with open source projects since 1996. He has been working as a Linux system administrator and part-time programmer for over a decade. He works at InMobi, where he loves designing tools and platforms. His love for MongoDB started in 2013, when he was amazed by its ease of use and stability. Since then, almost all of his projects have been written with MongoDB as the primary backend. Cyrus is also the creator of an open source alert management system called CitoEngine. He likes spending his spare time trying to reverse engineer software, playing computer games, or increasing his silliness quotient by watching reruns of Monty Python.

Learning Apache Kafka: Second Edition eBook


KEY FEATURES

Apache Kafka is simple to describe at a high level but has an immense amount of technical detail when you dig deeper. This step-by-step, practical guide will help you take advantage of the power of Kafka to handle hundreds of megabytes of messages per second from multiple clients.

  • Access 120 pages of content 24/7
  • Set up Kafka clusters
  • Understand the basic building blocks: producers, brokers, & consumers
  • Explore additional settings & configuration changes to achieve more complex goals
  • Learn how Kafka is designed internally & what configurations make it most effective
  • Discover how Kafka works w/ other tools like Hadoop, Storm, & more
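The producer/broker/consumer flow above can be sketched with an in-memory "topic". This is only an illustration of Kafka's key-based partitioning idea (a simplified stand-in for the real default partitioner); real Kafka adds brokers, replication, and consumer offsets:

```python
# In-memory sketch: a topic is a set of partitions, and a producer routes
# each record to a partition derived from its key.
NUM_PARTITIONS = 3
topic = [[] for _ in range(NUM_PARTITIONS)]  # one ordered list per partition

def produce(key, value):
    partition = sum(key.encode()) % NUM_PARTITIONS  # toy deterministic hash
    topic[partition].append((key, value))

for i in range(6):
    produce(key=f"user-{i % 2}", value=f"event-{i}")

# Records sharing a key always land in the same partition, which is how
# Kafka preserves per-key ordering.
same_key_partitions = {sum(k.encode()) % NUM_PARTITIONS
                       for p in topic for k, _ in p if k == "user-0"}
print(len(same_key_partitions))  # -> 1
```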

PRODUCT SPECS

Details & Requirements

  • Length of time users can access this course: lifetime
  • Access options: web streaming, mobile streaming
  • Certification of completion not included
  • Redemption deadline: redeem your code within 30 days of purchase
  • Experience level required: all levels

Compatibility

  • Internet required

THE EXPERT

Nishant Garg has over 14 years of software architecture and development experience in various technologies, such as Java Enterprise Edition, SOA, Spring, Hadoop, Hive, Flume, Sqoop, Oozie, Spark, Shark, YARN, Impala, Kafka, Storm, Solr/Lucene, NoSQL databases (such as HBase, Cassandra, and MongoDB), and MPP databases (such as GreenPlum).

He received his MS in software systems from the Birla Institute of Technology and Science, Pilani, India, and is currently working as a technical architect for the Big Data R&D Group with Impetus Infotech Pvt. Ltd. Previously, Nishant has enjoyed working with some of the most recognizable names in IT services and financial industries, employing full software life cycle methodologies such as Agile and SCRUM.

Nishant has also undertaken many speaking engagements on big data technologies and is the author of HBase Essentials, Packt Publishing.

Apache Flume: Distributed Log Collection for Hadoop: Second Edition eBook


KEY FEATURES

Apache Flume is a distributed, reliable, and available service used to efficiently collect, aggregate, and move large amounts of log data. It's used to stream logs from application servers to HDFS for ad hoc analysis. This ebook starts with an architectural overview of Flume and its logical components, and pulls everything together into a real-world, end-to-end use case encompassing simple and advanced features.

  • Access 178 pages of content 24/7
  • Explore channels, sinks, & sink processors
  • Learn about sources & channels
  • Construct a series of Flume agents to dynamically transport your stream data & logs from your systems into Hadoop
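Flume's logical components above (a source feeding a channel, a sink draining it) can be sketched with a plain Python queue standing in for the channel. A real Flume agent is configured in a properties file; this only illustrates the data flow that such an agent implements:

```python
# Minimal sketch of Flume's source -> channel -> sink flow.
from queue import Queue

channel = Queue()   # buffers events between source and sink
delivered = []      # stands in for the sink's destination (e.g. HDFS)

def source(events):
    for event in events:        # e.g. tailing an application log file
        channel.put(event)

def sink():
    while not channel.empty():
        delivered.append(channel.get())  # e.g. a write to HDFS

source(["log line 1", "log line 2"])
sink()
print(len(delivered))  # -> 2
```

The channel is what decouples the two ends: the source can keep accepting events while the sink drains at its own pace.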

PRODUCT SPECS

Details & Requirements

  • Length of time users can access this course: lifetime
  • Access options: web streaming, mobile streaming
  • Certification of completion not included
  • Redemption deadline: redeem your code within 30 days of purchase
  • Experience level required: all levels

Compatibility

  • Internet required

THE EXPERT

Steve Hoffman has 32 years of experience in software development, ranging from embedded software development to the design and implementation of large-scale, service-oriented, object-oriented systems. For the last 5 years, he has focused on infrastructure as code, including automated Hadoop and HBase implementations and data ingestion using Apache Flume. Steve holds a BS in computer engineering from the University of Illinois at Urbana-Champaign and an MS in computer science from DePaul University. He is currently a senior principal engineer at Orbitz Worldwide (http://orbitz.com/).

          Java Posse #435   

Roundup ‘13 - Cloud Performance Monitoring

Fully formatted shownotes can always be found at http://javaposse.com

Please join us for the 2014 Java Posse Roundup, from Feb 24th to 28th in Crested Butte, CO. The subject this year is Software Engineering Trends. More details and registration at http://www.mindviewinc.com/Conferences/JavaPosseRoundup/

Thanks

    • Opening - "Java" the parody song Copyright 1997 Broken Records and Marjorie Music Publ. (BMI),

    • Closing - Juan Carlos Jimenez - In the House (Intro No. 1)

  • To contact us:


The Java Posse consists of Tor Norbye, Carl Quinn, Chet Haase and Dick Wall


          Java Posse #434   

Roundup ‘13 - Efficient Searching

Fully formatted shownotes can always be found at http://javaposse.com

Please join us for the 2014 Java Posse Roundup, from Feb 24th to 28th in Crested Butte, CO. The subject this year is Software Engineering Trends. More details and registration at http://www.mindviewinc.com/Conferences/JavaPosseRoundup/

Thanks

    • Opening - "Java" the parody song Copyright 1997 Broken Records and Marjorie Music Publ. (BMI),

    • Closing - Juan Carlos Jimenez - In the House (Intro No. 1)

  • To contact us:


The Java Posse consists of Tor Norbye, Carl Quinn, Chet Haase and Dick Wall

