I love Splunk, I really do. I’ve been using it since 2005 or so, and while I don’t always have a need for it, it has often let me build the impossible, up to crazy stuff like squeezing gigs of carefully truncated logs over an ISDN line and then letting everyone do happy analysis on them locally.
I just wish there was a different pricing model for troubleshooting only.
What I’d want now:
- scponly access to a -oneshot uploader
- 20GB volume
- 3 days retention
- <$150 price tag at 20GB, and under $50 for <5GB
Setting up a temporary VM and Splunk itself, even manually, takes less than an hour. But larger sets of logs immediately hit the volume limit.
The thing is:
I don’t want to be able to upload 500MB each day for free and store it forever, as the basic license allows – that adds up to another ~183GB per year, by the way.
I don’t want to be able to upload 20GB each day and store it for 30 days, as Sumologic does for around <this much> money – that means parking 600GB of data on their disks continuously.
What I often need is something that takes less than 5 minutes to access: get it set up, push my existing logfiles to it, work on them, then forget about them.
The last time, with Sumologic, I was faster installing a Splunk VM for a oneshot import than figuring out how to feed them all my existing data and still have it nicely indexed. And that was with a volume of just a few hundred megs.
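For reference, the oneshot workflow I fall back to is nothing more than scp plus Splunk’s `add oneshot` CLI command. A minimal sketch; the host name, paths, index and sourcetype here are made-up placeholders, not real infrastructure:

```shell
# Copy the existing logs to the (hypothetical) analysis VM.
scp -r ./customer-logs/ analyst@splunk-vm:/tmp/customer-logs/

# On the VM: index the files exactly once.
# No forwarder, no daily feed - a one-time import into a throwaway index.
/opt/splunk/bin/splunk add oneshot /tmp/customer-logs/app.log \
    -index troubleshooting -sourcetype customer_app
```

Once the searching is done, deleting that throwaway index also gets rid of the sensitive data, which fits the short-retention idea above.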
And then I wanna run a few smart queries, maybe for 1-3 days; 7 days would be cool so I could even show the customer the findings online. But the actual value drops once I’m done searching.
So: one upload of data (assuming it worked), not one each day. And no, I definitely don’t need 30 days retention; after 30 days I don’t even want to remember the data is there, all the more so since most log data is sensitive and shouldn’t be stored that long anyway.
Performance isn’t a great factor either, I’ll always get better performance right here – if I feel it helps I can just turn on a real server, a powerful one, not the run-of-the-mill cloud-SATA stuff. So the selling point is simply to let me start analyzing faster than I can now, and for less money:
The price limit is set by how long I really care about the data, and the service has to cost less than the few days of work that switching to logstash would cost me.
It’s the polished features and speed of Splunk versus the freedom I’d get with logstash (just think of using OpenNebula: spin up 4 fresh, dedicated instances each time I need to analyze logs – how long would that take, 5 minutes?).
Let’s say I’d spend $5000 to set up a “perfect” logstash lab env, then it’d do this job well for at least 5 years.
Let’s ignore I’d then also earn money by selling such setups.
That means $999 per year ($5000 spread over five years) would be the top limit for the benefit of having a readily available system – especially if it includes setting up read-only accounts for the affected parties and all those Splunk Enterprise features.
Splunk would run around $12k for this, Sumo around $5000, because they all assume I’d be pushing data daily and want to store it.
But I don’t. So it doesn’t work for me.