Explore, analyze & understand complex data structures with JSON Crack
This web-based tool has a lot of cool features: node search, embeddable widgets, downloadable images, dark mode & more. Jump straight to the online editor to try it out.
This is a neat little CLI utility from Kelly Brazil that parses the output of common Unix system commands into JSON/YAML. We’re linking to the latest version (1.22.0), which adds new magic parsers for /proc files. This handy glue-type tool gets a little more useful with every release!
We recently logged jqq (an interactive wrapper around jq), but it’s written in Ruby, which can make installing it a burden. jqp wraps gojq, so it compiles to a single, self-contained binary, and the UI is pretty compelling as well.
This is cool because of its portability, but also because you can embed it as a library in your Go projects. It’s not identical to jq in practice, though. Here’s a long list of differences between the two.
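If the library angle intrigues you, here’s a minimal sketch of embedding gojq in a Go program (the filter and input data are made up for illustration):

package main

import (
	"fmt"
	"log"

	"github.com/itchyny/gojq"
)

func main() {
	// Compile a jq-style filter once, then run it against plain Go values.
	query, err := gojq.Parse(".users[] | .name")
	if err != nil {
		log.Fatal(err)
	}

	// gojq works on the same shapes encoding/json produces:
	// map[string]any, []any, string, float64, bool, nil.
	input := map[string]any{
		"users": []any{
			map[string]any{"name": "Ada"},
			map[string]any{"name": "Grace"},
		},
	}

	iter := query.Run(input)
	for {
		v, ok := iter.Next()
		if !ok {
			break
		}
		if err, isErr := v.(error); isErr {
			log.Fatal(err)
		}
		fmt.Println(v) // Ada, then Grace
	}
}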
Nick Nisi shared this with me recently and it’s cool, so I thought I’d pass it along. Here’s Nick describing it:
And it is a visual wrapper around jq, that kind of does the fzf type thing where as you’re writing out your query, it’s live showing you a preview in virtual text of exactly what would get returned by what you’re querying as you go. So you can use that as a nice tool to build out your jq syntax, or your jq query, and in real time get that feedback.
The only bummer is that it’s written in Ruby. Don’t get me wrong, I love Ruby. But it requires you to have Ruby tooling on your machine to use jqq, which many people don’t have or want. BUT it’s a mere 241 lines of code, so porting it to something a little more portable shouldn’t be too much work…
Anton Zhiyanov shows how you can store non-normalized data in SQLite and use virtual columns to query against it without the massive performance penalty that json_extract would otherwise incur.
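The gist of the trick, with hypothetical table and column names, looks like this:

-- events.data holds a JSON blob; expose one field as a virtual generated column
ALTER TABLE events ADD COLUMN category TEXT
  AS (json_extract(data, '$.category')) VIRTUAL;

-- index the virtual column so lookups don't re-parse the JSON on every row
CREATE INDEX events_category_idx ON events(category);

SELECT id FROM events WHERE category = 'signup';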
The second jq alternative we’ve discovered this week! (first here)
jq is hard to use. There are alternatives like fq and zq, but they still make you learn a new programming language. I’m tired of learning new programming languages.

gq is not optimized for speed, flexibility or beauty. gq is optimized for minimal learning/quick usage. gq understands that you don’t use it constantly; you use it once a month and then forget about it. So when you come back to it, gq will be easy to relearn. Just use the built-in library like you would any other Go project and you’re done. No unfamiliar syntax or operations, or surprising limits. That’s it.
I don’t know if Go is a great fit for this use-case, but if you already know it well… makes sense.
If you’ve found the (excellent) jq tool for working with JSON a bit unwieldy… check out zq and see if you like its API any better. I wouldn’t put too much weight on the faster aspect, though:
We will cover zq’s performance in a future article, but to cut to the chase here, zq is almost always at least a bit faster than jq when processing JSON inputs
“Almost always at least a bit faster” is not something you’re likely to notice in practice.
A nice little web-based tool to help you quickly/visually grok some JSON data you’re working with. I tested it on some game data from an old JS Danger episode, and it handled it pretty well.
Automerge is a Conflict-Free Replicated Data Type (CRDT), which allows concurrent changes on different devices to be merged automatically without requiring any central server.
I could see this library being super useful in many applications.
JSON support in SQLite has been in the works for a while now, but prior to this release you had to opt in to it. Now it’s on by default and this is a great rundown of using the query features.
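If you haven’t played with the JSON functions yet, queries look roughly like this (table and fields invented for the example):

-- pull a scalar out of a JSON column
SELECT json_extract(profile, '$.email') FROM users;

-- expand a JSON array into rows
SELECT users.id, tag.value
FROM users, json_each(users.profile, '$.tags') AS tag;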
Why create another form library? Here’s what Fomir’s author says to that:
I have tried many form libraries, like redux-form, formik, final-form, react-hook-form… None of them suit my taste. I would expect a forms library with these features:
- Using schema
- Easy to update form state
- High Performance
- More abstract
Fomir creates forms from a form schema, which is a JSON tree. The form schema is very flexible: you can create any form with a schema.
I could see this as especially useful for form builders and similar tools where you’re providing a graphical way to build forms because your backend would just have to emit the correct JSON and let Fomir take it from there.
Daniel Stenberg’s first step toward adding first-party JSON support to everyone’s favorite command-line URL transmitter is a flag that is basically a shortcut for the following flags:
--data [arg]
--header "Content-Type: application/json"
--header "Accept: application/json"
Will more come of this? Time (and the community) will tell…
The discussion has been ignited in the curl community about what, if anything, we should do in curl to make it a smoother and better tool when working with JSON. The offered opinions range from nothing (“curl is content agnostic”) to full-fledged JSON generator, parser and pretty-printer (or a combination in between).
Sonic outperforms other Go implementations on all JSON sizes; the linked benchmarks show the results on a large dataset.
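API-wise, Sonic presents itself as a drop-in for encoding/json, so adopting it is mostly an import swap. A quick sketch, assuming the bytedance/sonic import path and a made-up struct:

package main

import (
	"fmt"
	"log"

	"github.com/bytedance/sonic"
)

type Event struct {
	Name  string `json:"name"`
	Count int    `json:"count"`
}

func main() {
	// Marshal/Unmarshal mirror encoding/json's signatures.
	data, err := sonic.Marshal(Event{Name: "page_view", Count: 42})
	if err != nil {
		log.Fatal(err)
	}

	var e Event
	if err := sonic.Unmarshal(data, &e); err != nil {
		log.Fatal(err)
	}
	fmt.Println(string(data), e.Count)
}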
This is a solid primer on the usefulness of jq (a lightweight, command-line JSON processor).
In this article, I’m going to go over the basic building blocks of jq in enough depth that you will be able to understand how jq works. Of course, you still might occasionally need to head to Google to find a function name or check your syntax, but at least you’ll have a firm grounding in the basics.
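If you want a taste before reading, a couple of one-liners (data invented for the example):

echo '{"users":[{"name":"Ada","admin":true},{"name":"Grace","admin":false}]}' \
  | jq '.users[] | select(.admin) | .name'
"Ada"

echo '[1,2,3]' | jq -c 'map(. * 2)'
[2,4,6]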
Too often, web tiers are full of boilerplate that does nothing except convert a result set into JSON. A middle tier could be as simple as a function call that returns JSON. All we need is an easy way to convert result sets into JSON in the database.
PostgreSQL has built-in JSON generators that can be used to create structured JSON output right in the database. These functions run right next to the data, upping performance, lowering bandwidth usage, and radically simplifying web tiers.
I certainly wouldn’t advise this in many (most?) scenarios, but I can see a time and a place where “cutting out the middle man” would be quite advantageous, indeed. Keep it simple. Keep it lean.
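For the curious, “returning JSON straight from the database” looks something like this in Postgres (schema invented for the example):

-- one JSON document per user, with their orders nested inside
SELECT json_build_object(
         'id',     u.id,
         'name',   u.name,
         'orders', (SELECT json_agg(json_build_object('id', o.id, 'total', o.total))
                    FROM orders o
                    WHERE o.user_id = u.id)
       )
FROM users u;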
Craig Kerstiens:
Postgres has had “JSON” support for nearly 10 years now. I put JSON in quotes because, well, 10 years ago when we announced JSON support we kinda cheated. We validated JSON was valid and then put it into a standard text field. Two years later in 2014 with Postgres 9.4 we got more proper JSON support with the JSONB datatype. My colleague @will likes to state that the B stands for better. In Postgres 14, the JSONB support is indeed getting way better.
A small but solid improvement to how you query JSONB, making it more JSON-y than ever.
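If memory serves, the headline change here is Postgres 14’s jsonb subscripting, which lets you use bracket syntax instead of operator soup (table and fields here are hypothetical):

-- Postgres 14 jsonb subscripting, for both reads and writes
SELECT details['name'] FROM shipments WHERE details['status'] = '"delivered"';
UPDATE shipments SET details['status'] = '"lost"' WHERE id = 42;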
jq is a hugely useful tool for anyone dealing with JSON of varying shapes and sizes. If that’s you, but you haven’t given jq a serious try, this is a great little primer on its use and use cases.
Peter Ohler:
I had a dream. I’d write a fast JSON parser, generic data, and a JSONPath implementation and it would be beautiful, well organized, and something to be admired. Well, reality kicked in and laughed at those dreams.
This post lays out Peter’s plan, his journey, and his lessons learned in great detail. Seems like it’d pair nicely with the recent Go Time episode all about JSON.
JSON (JavaScript Object Notation) is used all over the web as a text-based way of transmitting data. In this episode, we explore Go’s encoding/json package, and others with Daniel Marti.
I love this idea of having a singular, parseable data source for your resume that can be read & formatted in many different contexts & places. One cool example of this is react-ultimate-resume, which uses JSON Resume as its data source.
This does not mean curl can fetch some JSON and print it to STDOUT. That would not be new. What it means is that the --write-out option now supports JSON as an output format. Pipe that output to a tool like jq and you get something like this:
{
"url_effective": "https://example.com/",
"http_code": 200,
"response_code": 200,
[lots more but I snipped them for length]
}
Which is pretty cool, if you ask me.
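If you want to try it yourself, the incantation is roughly this (the URL is a placeholder, and %{json} requires a reasonably recent curl):

curl -s -o /dev/null --write-out '%{json}' https://example.com/ | jq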
Think of this like jq, but for people who love parentheses. 😀
cat test.json | jql '(elem "countries" (elem (keys) (elem "name")))'
[
"Poland",
"United States",
"Germany"
]
When demoing Hyperview to new engineers, there’s one comment that frequently comes up about the HXML data format:
“XML, really? It’s bloated and outdated. Why not use JSON? It’s the future.”
These comments imply that JSON is the “one true file format” that should be used for everything, but we don’t believe there’s such a thing. Each format makes tradeoffs in encoding, flexibility, and expressiveness to best suit a specific use case.
The author makes a pretty solid argument that JSON is better for lists, while XML is better for trees.
JSON-formatted files are readable to humans but the lack of comments decreases readability. With JSONC, you can use block (/* */) and single line (//) comments to describe the functionality. Microsoft VS Code also uses this format in their configuration files like settings.json, keybindings.json, launch.json, etc.
This is a Go-only implementation, but the concept is portable to any language (hint, hint).
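For reference, a JSONC file is exactly what it sounds like (the settings here are made up):

{
  // enable format-on-save for this workspace
  "editor.formatOnSave": true,
  /* ligatures off, for readability */
  "editor.fontLigatures": false
}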