Monolog Colored Line Formatter

(Screenshot: the default color scheme)

Over a year ago I quickly whipped up a Colored Line Formatter for use with Monolog. As I’m building colorised output into Mixed Content Scan, I – finally – took the time to actually put the darn thing out in the open.

bramus/monolog-colored-line-formatter is a formatter for use with Monolog. It augments the Monolog LineFormatter by adding color support. To achieve this, bramus/monolog-colored-line-formatter uses ANSI escape sequences, which makes it perfect for use on text-based terminals (viz. the shell).
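The escape-sequence trick itself is language-agnostic: wrap the text in `ESC[<n>m` … `ESC[0m` and the terminal renders it in color. A minimal illustrative sketch (in JavaScript, not the library’s own PHP code):

```javascript
// ESC [ <n> m switches the terminal to color/style <n>; ESC [ 0 m resets it.
const ESC = '\x1b[';
const RESET = ESC + '0m';

function colorize(text, colorCode) {
  return ESC + colorCode + 'm' + text + RESET;
}

// 31 = red, 33 = yellow — typical choices for ERROR and WARNING lines.
console.log(colorize('ERROR: something broke', '31'));
console.log(colorize('WARNING: heads up', '33'));
```

Pipe that through a terminal and the lines come out red and yellow; redirect it to a file and you’ll see the raw `\x1b[31m…` bytes instead.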

Installation is possible via Composer:

composer require bramus/monolog-colored-line-formatter ~1.0

To use it, create an instance and set it as the formatter for the \Monolog\Handler\StreamHandler that you use with your \Monolog\Logger instance:

use \Monolog\Logger;
use \Monolog\Handler\StreamHandler;
use \Bramus\Monolog\Formatter\ColoredLineFormatter;

$log = new Logger('DEMO');
$handler = new StreamHandler('php://stdout', Logger::WARNING);
$handler->setFormatter(new ColoredLineFormatter());
$log->pushHandler($handler);

$log->addError('Lorem ipsum dolor sit amet, consectetur adipiscing elit.');

The color scheme can be adjusted if needed.

Monolog Colored Line Formatter →


Bunyan

$ cat hi.js
var bunyan = require('bunyan');
var log = bunyan.createLogger({name: 'myapp'});
log.info('hi');
log.warn({lang: 'fr'}, 'au revoir');
$ node hi.js
{"name":"myapp","hostname":"banana.local","pid":40161,"level":30,"msg":"hi","time":"2013-01-04T18:46:23.851Z","v":0}
{"name":"myapp","hostname":"banana.local","pid":40161,"level":40,"lang":"fr","msg":"au revoir","time":"2013-01-04T18:46:23.853Z","v":0}

Bunyan is a simple and fast JSON logging library for node.js services.

The true power comes when it’s combined with the bundled bunyan binary, which pretty-prints bunyan logs:

$ node hi.js | bunyan
[2013-01-04T19:01:18.241Z]  INFO: myapp/40208 on banana.local: hi
[2013-01-04T19:01:18.242Z]  WARN: myapp/40208 on banana.local: au revoir (lang=fr)

It’s also possible to filter the messages:

$ node hi.js | bunyan -l warn
[2013-01-04T19:08:37.182Z]  WARN: myapp/40353 on banana.local: au revoir (lang=fr)
$ node hi.js | bunyan -c 'this.lang == "fr"'
[2013-01-04T19:08:26.411Z]  WARN: myapp/40342 on banana.local: au revoir (lang=fr)
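Conceptually the `-l` and `-c` flags just parse each JSON record and keep the ones that match. A rough sketch of that filtering in plain JavaScript (the helper names are hypothetical; the numeric levels are Bunyan’s: trace=10, debug=20, info=30, warn=40, error=50, fatal=60):

```javascript
// Bunyan emits one JSON record per line; level names map to numbers.
const LEVELS = { trace: 10, debug: 20, info: 30, warn: 40, error: 50, fatal: 60 };

// Rough equivalent of `bunyan -l warn`: keep records at or above a level.
function filterByLevel(lines, levelName) {
  return lines
    .map((line) => JSON.parse(line))
    .filter((rec) => rec.level >= LEVELS[levelName]);
}

// Rough equivalent of `bunyan -c 'this.lang == "fr"'`: evaluate a
// predicate with the parsed record bound as `this`.
function filterByCondition(lines, predicate) {
  return lines
    .map((line) => JSON.parse(line))
    .filter((rec) => predicate.call(rec));
}

const logLines = [
  '{"name":"myapp","level":30,"msg":"hi"}',
  '{"name":"myapp","level":40,"lang":"fr","msg":"au revoir"}',
];

console.log(filterByLevel(logLines, 'warn').map((r) => r.msg));  // [ 'au revoir' ]
console.log(filterByCondition(logLines, function () { return this.lang === 'fr'; }).length);  // 1
```

Note the classic `function` expression in the condition: an arrow function would not pick up the record as `this`.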

node-bunyan →

(Found via Top 10 Mistakes Node.js Developers Make)

logstash

logstash is a tool for managing events and logs. You can use it to collect logs, parse them, and store them for later use (like, for searching). Speaking of searching, logstash comes with a web interface for searching and drilling into all of your logs.

Part of the ElasticSearch family. ES itself is not required, as output can be sent to any destination you want, such as statsd (which in turn can flush the data to graphite to visualize it).

logstash →

Logging client-side errors

function logError(details) {
  // POST the error report – tagged with the user agent – to the
  // collection endpoint as JSON.
  $.ajax({
    type: 'POST',
    url: 'http://mydomain.com/api/1/errors',
    data: JSON.stringify({context: navigator.userAgent, details: details}),
    contentType: 'application/json; charset=utf-8'
  });
}

// window.onerror fires for any uncaught JavaScript error on the page.
window.onerror = function(message, file, line) {
  logError(file + ':' + line + '\n\n' + message);
};

Let’s keep this short. Too few websites log JavaScript errors. Let’s build a simple system to track client-side errors.

Makes clever use of the window.onerror event.

Also comes with example server-side code to actually log the event on the server.

You Really Should Log Client-Side Errors →

I Know What You Downloaded on BitTorrent …

Imagine you’d write a crawler that connects to many torrent trackers for many torrents and then log all IP addresses that are also connected. Now, that’s exactly what You Have Downloaded does: it aggregates all public data and then exposes it.

We came up with the idea of building a crawler like this and keeping the maintenance price under $300 a month. There was only one way to prove our theory worked — to implement it in practice. So we did. Now, we find ourselves with a big crawler.

Although the site doesn’t track all traffic (and thus doesn’t have a list of all IP addresses — I for one have downloaded nothing according to the site … could swear I just pulled in a new Linux distro) it works rather well. The results aren’t surprising either. Typing in an IP address from one of the trolling commenters here on bram.us gives me …

Ouch! 😀

You Have Downloaded →