Pretty much. I plan on using this program for a long time. I don’t want a codebase that looks like something out of an H.P. Lovecraft novel when I have to fix something in the future.
Not everyone is paying for LLMs, even now. So I think it is perfectly reasonable to assume good intentions here.
Someone spent their own tokens to ponder your code and thought they'd share the result. For anyone else looking, like me, it means I can see the project will probably come up relatively clean without spending my own tokens or installing it, and I'm more likely to install it now that I can see that.
Which would have effectively overridden my whole bashrc config if I had blindly copy-pasted it.
A few minutes later, I asked it to create a .gitignore file for the current project, right after it had generated a private key, and it failed to add the private key file to the .gitignore.
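The fix the model missed is a one-liner; the key filename below is hypothetical, so match it to whatever file was actually generated:

```sh
# Keep the generated private key out of version control
# (hypothetical filename; adjust to the file the tool created).
echo 'deploy_key.pem' >> .gitignore
```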
I don't see yet how these tools can be labeled as 'major productivity boosters' if you lose basic security and privacy with them...
IME it’s a term that’s been popularized by generative AI solutions, a meme at this point, and it doesn’t speak quantifiably to real production readiness in a professional sense. It’s something that I’ve seen models frequently claim during coding and planning sessions, and it can also be found around Reddit/Twitter/Github vibe coding spaces.
Seeing this term in marketing materials signals that the target audience is non-professionals (and I don’t mean this derisively, only that we need to apply a different lens).
> I'm not sure of what "production ready" is supposed to mean here
Given this text at the bottom:
> The high-performance HTML to PNG engine. Built for developers, agents, and automation. Completely free to use. All generated assets are public and ephemeral.
...I assume the implications are that:
1. this service will scale to meet request load without QoS degradation (i.e. it's probably running on FaaS infra), rather than being a fixed-size slowly-elastic cluster that would get choked out if your downstream service got popular and flooded it with thousands of concurrent requests
2. you can directly take the URLs the service spits out, and serve them to your downstream service's clients, without worrying much about deliverability, because there's an object store + edge CDN involved.
In other words, it's not just a single headless-chromium instance running on a box somewhere; you could actually use this thing as an upstream dependency and rely on it.
> the demo image is not optimized, `optipng` command decreases its size by 53.21%
Given that the author's imagined use-case is giving non-multimodal LLMs a way to emit visuals (the prompt at the bottom of the page starts "When asked to create visuals, charts, or mockups"), I think their idea is that the resulting rendered images would more-likely-than-not only be requested once, immediately, to display the result to the same user who caused the prompt to be evaluated.
Where, in that case, the metric of concern isn't "time+bytes cost for each marginal fetch of the resulting image from the CDN"; but rather "end-to-end wall-clock time required to load the HTML in the headless browser, bake the image, push it to the object store, and serve it once to the requesting user."
OptiPNG would slightly lower that last "serve it once" cost, but massively inflate the "bake the image" time, making it not worth it.
(I suppose they could add image optimization as something you could turn on — but "image optimization at the edge" is already a commodity product you can get from numerous vendors, e.g. Cloudflare.)
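If you want to sanity-check that tradeoff yourself, here's a rough sketch; the optipng flags are real, the filename is hypothetical:

```sh
cp render.png render.orig.png    # keep the original for comparison
time optipng -o2 render.png      # -o2 is a mid-level preset; optimizes in place
ls -l render.orig.png render.png # weigh bytes saved against seconds spent
```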
It might be worth considering a feature to time or schedule each flow's animation, rather than having them all run in infinite loops at the same time.
UX feedback:
* The animation and the whole interface are sluggish on Firefox/Linux. There's about a one-second delay after each action (like clicking on an option).
* The site's CSS does not load on an old version of Chrome (v90), and the chart and animation don't load either.
Well-thought-out, sophisticated ways of modeling data for analytics purposes, using established approaches, are being replaced by just pulling data from the sources into cloud data platforms, with barely any change to the source structure.
In the past we used to model layers in a data-warehousing infrastructure, each with a purpose and a data modeling methodology. For instance, an operational data store (ODS) layer integrating data from all the sources, with a normalized data structure. Then a set of datamarts, each containing a subset of the ODS content in a denormalized format and each focused on a specific functional domain.
We had rules, methods to structure data in order to get performant reporting, and a customer orientation.
Coming from this world, it seems like data governance principles are gone, and it feels like some organisations use the modern data stack the same way each analyst would use their own Excel files in their own corner, without any safeguards.
What do they need Google Analytics for? Is it a must-have or a nice-to-have? In my experience most small website owners have web analytics set up but barely ever check the reports.
Some alternatives:
* don't have web analytics at all
* self-host Plausible Analytics or another open-source analytics solution
* use the data from server-side access logs, for those using nginx, Apache, or similar (a quick sketch follows this list)
* use Vercel web analytics' free tier (relevant for kanadojo, which appears to be hosted there) - more privacy-friendly than Google Analytics.
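As a sketch of the access-log option, assuming nginx's default combined log format and log path (adjust both for your setup):

```sh
# Top 20 requested paths by hit count; $7 is the request path
# in the default "combined" nginx log format.
awk '{ print $7 }' /var/log/nginx/access.log | sort | uniq -c | sort -rn | head -20
```

For something closer to a real dashboard, GoAccess can build an HTML report from the same log file.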
Many banks in EU countries make it mandatory to have their smartphone app installed in order to validate operations clients perform in their web browsers :/
They give you a hardware token that spits out some numbers and use that as your second factor instead, usually after a lot more fiddling than a TOTP app would require.
Or they don't and tell you to use a different bank.
Some airlines (looking at you, Ryanair) really exploit the system. Cabin luggage can cost triple the price of the actual ticket, and that extra fee only pops up later in the booking process.
What’s worse, you’re forced to buy a bundle with ‘Priority Boarding’ just to get cabin luggage - there's no option to buy it alone.
The ‘priority boarding’ option is a scam in itself: you pay extra just to stand around in a crowded corridor for about 30-40 minutes while the last passengers get off the plane and the cleaning crew takes the trash out. Ryanair planes don't seem to get cleaned anymore between two flights; there's no time for that.