Kippo2ElasticSearch is a Python script to transfer data from a Kippo SSH honeypot MySQL database to an ElasticSearch instance (server or cluster). This is useful for indexing and searching the dataset, and makes it easy to visualize important stats using Kibana.

The project also provides an exported Kibana dashboard file that you can import to your own instance and get immediate visualization results from your honeypot data. The two sample screenshots below show a portion of that dashboard.

DOWNLOAD Kippo2ElasticSearch

Kippo2ElasticSearch depends on the following Python modules: GeoIP, pony, pyes. Installing these is trivial via pip.
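For instance, the dependencies can typically be installed in one go (package names as listed above; depending on your distribution, the GeoIP C library headers may need to be installed first for the GeoIP module to build):

```shell
pip install GeoIP pony pyes
```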





  1. this is interesting, kibana, sounds like from Japan 😀

    • harry on August 6, 2014 at 2:58 AM

    Where do I put the file? I've got Kippo, Kibana and ElasticSearch running, but if I import the JSON file in the Kibana web interface, it imports an empty dashboard.

    I suspect the py file needs to be loaded into ElasticSearch, but I couldn't figure out how or where… Any help??

    1. Hi Harry.

      You don’t have to put the Python file “somewhere”. Just edit the file and enter the correct values for your MySQL database server and ElasticSearch service and run it.

      It's the JSON file, not the py file, that should be uploaded to Kibana.


        • harry on August 8, 2014 at 9:55 PM

        Thanks for your answer, I figured it out the next day, I feel rather stupid 🙁
        I noticed the py script loads every hack attempt into ElasticSearch every time you run it. I made cron run the py script to keep Kibana up to date, but after 2 days the script already takes more than 1 minute to complete (+1000 attempts in 2 days, kind of fun to watch them try haha)

        Is there a way to let the script only load the new attempts ?

        • Ion on August 8, 2014 at 10:17 PM

        Hi Harry.

        Well, unless you find a way to keep state, then no. One solution could be to query ElasticSearch first and get the last doc’s ID and then use that in the SQL query and get only rows with an ID greater than that.
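        A minimal sketch of that idea, assuming an `id` field in the ES documents mirroring the MySQL primary key, a `kippo/auth` index/type, and an `auth` table — all of these names are assumptions, adjust them to your own setup:

        ```python
        import json
        import urllib.request

        # Hypothetical ES search endpoint -- adjust index/type to your setup.
        ES_SEARCH_URL = "http://localhost:9200/kippo/auth/_search"

        def last_indexed_id(es_url=ES_SEARCH_URL):
            """Return the highest `id` already indexed in ES, or 0 if the index is empty."""
            body = json.dumps({
                "size": 1,
                "sort": [{"id": {"order": "desc"}}],  # assumes an `id` field mirroring MySQL's PK
                "_source": ["id"],
            }).encode()
            req = urllib.request.Request(
                es_url, data=body, headers={"Content-Type": "application/json"}
            )
            with urllib.request.urlopen(req) as resp:
                hits = json.load(resp)["hits"]["hits"]
            return hits[0]["_source"]["id"] if hits else 0

        def incremental_sql(last_id):
            """Build the SQL that fetches only rows logged after the last indexed one."""
            return "SELECT * FROM auth WHERE id > %d ORDER BY id" % last_id
        ```

        Run from cron, this would only push the rows ES hasn't seen yet, so the runtime stays proportional to new attempts rather than the whole table.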

        But I have some news as well. I’ve created a fork of Kippo itself and added ElasticSearch support to it. So you can now save authorization attempts directly to ES. Please see my new blog post (going to publish it in some minutes) for more info.


        • harry on August 9, 2014 at 12:46 AM

        I get what you're saying, but (as you probably noticed) I'm not proficient enough to implement such a construction 🙂 I'll just set cron to a larger interval for now.

        Awesome news about your Kippo fork. If I understand correctly, directly saving authorization attempts to ES should render my previous question moot, right?

        Right now there are already 2000+ attempts (from ~30 IPs); nearly all of them from China. Are these actual humans trying to get in, or just bots scanning the internet for vulnerabilities?

        • Ion on August 9, 2014 at 1:11 AM

        Hi Harry,
        The blog post is going up soon, but yes. You will be able to add your ElasticSearch instance/cluster details to Kippo's configuration file, and then every connection attempt will be automatically logged in ES, much like MySQL logging works (by the way, you can use them together).

        Regarding the activity you're seeing, I can't be sure whether it's a human or a bot, although it's mostly humans attacking the SSH service (using automation, of course). Bots exploit other kinds of services, for example SMB/CIFS. You can try setting up Dionaea (using Dionaea-Vagrant, for example) to catch the latter.

