Hacker News | me_bx's comments

Maybe they care about it being robust in the long run, maintainable, secure and/or not too bloated.

Pretty much. I plan on using this program for a long time. I don’t want a codebase that looks like something out of a H.P. Lovecraft novel when I have to fix something in the future.

[flagged]


Are you serious? You don’t think I’m capable of running the codebase through an LLM? Or is this supposed to be some kind of gotcha? Rude and lame.

I'm not the OP.

Not everyone is paying for LLMs, even now. So I think it is perfectly reasonable to assume good intentions, here.

Someone spent their own tokens to ponder your code and thought they'd share the result. For anyone else looking, like me, I can see that this is probably going to come up relatively clean without having to spend my own tokens, or install it, and I'm more likely to, now that I can see that.


Turns out it was bad intentions, I respect the optimism though. And thanks for taking a look!

Ignore them. These people are insane, don't ruin your day reading their messages.

[flagged]


Am I reading this correctly? You got offended on behalf of LLMs because of some quip I made about how I choose to write code in my spare time?

This is fascinatingly psychotic - but for your own good you might want to shut down the agents for a bit and take a walk.


> They had their chance with Mistral and failed spectacularly with just creating anti-AI regulations.

What failed with Mistral?

Which anti-AI regulations are we talking about, and don't these apply to any solution distributed in the European Union, hence also to American ones?


Is it?

Yesterday, gemini told me to run this:

    echo 'export ANDROID_HOME=/opt/my-user/android-sdk' > ~/.bashrc
Which would have effectively overwritten my whole bashrc config if I had blindly copy-pasted it.
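For reference, the difference between `>` (truncate) and `>>` (append) is easy to demonstrate; this sketch uses a scratch file in /tmp rather than the real ~/.bashrc:

```shell
# Simulate an existing bashrc with one line of config
printf 'existing config\n' > /tmp/demo_bashrc

# '>' truncates: the previous contents are lost
echo 'export ANDROID_HOME=/opt/my-user/android-sdk' > /tmp/demo_bashrc
wc -l < /tmp/demo_bashrc   # 1 line: previous config gone

# Reset, then use '>>' instead, which appends and preserves the file
printf 'existing config\n' > /tmp/demo_bashrc
echo 'export ANDROID_HOME=/opt/my-user/android-sdk' >> /tmp/demo_bashrc
wc -l < /tmp/demo_bashrc   # 2 lines: original config intact
```

The safe version of the suggested command would therefore have been the same line with `>>` in place of `>`.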

A few minutes later, asking it to create a .gitignore file for the current project - right after generating a private key - it failed to include the private key file in the .gitignore.
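For illustration, an entry like the following in the generated .gitignore would have covered it (the filenames here are hypothetical, depending on how the key was saved):

```
# keep generated private keys out of version control
private_key.pem
*.key
```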

I don't see yet how these tools can be labeled as 'major productivity boosters' if you lose basic security and privacy with them...


We were discussing the CLI; the output is on the model.


Congrats on launching, beautiful design.

I'm not sure what "production ready" is supposed to mean here, but the demo image is not optimized; the `optipng` command decreases its size by 53.21%.


IME it’s a term that’s been popularized by generative AI solutions, a meme at this point, and doesn’t speak to real, quantifiable production readiness in any professional sense. It’s something that I’ve seen models frequently claim during coding and planning sessions, and it can also be found around Reddit/Twitter/Github vibe coding spaces.

Seeing this term in marketing materials signals that the target audience is non-professionals (and I don’t mean this derisively, only that we need to apply a different lens).


Also, don't ignore WebP and AVIF... those can really do wonders.


> I'm not sure of what "production ready" is supposed to mean here

Given this text at the bottom:

> The high-performance HTML to PNG engine. Built for developers, agents, and automation. Completely free to use. All generated assets are public and ephemeral.

...I assume the implications are that:

1. this service will scale to meet request load without QoS degradation (i.e. it's probably running on FaaS infra), rather than being a fixed-size slowly-elastic cluster that would get choked out if your downstream service got popular and flooded it with thousands of concurrent requests

2. you can directly take the URLs the service spits out, and serve them to your downstream service's clients, without worrying much about deliverability, because there's an object store + edge CDN involved.

In other words, it's not just a single headless-chromium instance running on a box somewhere; you could actually use this thing as an upstream dependency and rely on it.

> the demo image is not optimized, `optipng` command decreases its size by 53.21%

Given that the author's imagined use-case is giving non-multimodal LLMs a way to emit visuals (the prompt at the bottom of the page starts "When asked to create visuals, charts, or mockups"), I think their idea is that the resulting rendered images would more-likely-than-not only be requested once, immediately, to display the result to the same user who caused the prompt to be evaluated.

Where, in that case, the metric of concern isn't "time+bytes cost for each marginal fetch of the resulting image from the CDN"; but rather "end-to-end wall-clock time required to load the HTML in the headless browser, bake the image, push it to the object store, and serve it once to the requesting user."

OptiPNG would slightly lower that last "serve it once" cost, but massively inflate the "bake the image" time, making it not worth it.

(I suppose they could add image optimization as something you could turn on — but "image optimization at the edge" is already a commodity product you can get from numerous vendors, e.g. Cloudflare.)


The bots using these images apply their own compression anyway.


Thank you. I can add PNG compression too, right?


Curious - how did you find that the image is not optimized? Is there a tool for that?


I ran the 'optipng' command on the generated image; it losslessly recompresses the PNG, preserving quality while decreasing file size.


Congrats on launching.

It might be worth considering a feature to time/schedule each flow's animation, rather than having them run in an infinite loop, all at the same time.

UX feedback:

* The animation and the whole interface are sluggish on Firefox/Linux. There's about a 1-second delay after each action (like clicking on an option).

* The site's CSS does not load on an old version of Chrome (v90), and neither do the chart and animation.


Nice idea.

Bug: I tried in my area in the Canary Islands and all the places were off, sometimes in the middle of nowhere or even in the sea.

Also, in small villages, we don't necessarily have a town hall, a library, etc. (within the selected radius), but the game asked to pin these.


Thanks for the feedback! I’ll work on the location stuff and small villages.


Not OP, but in a similar boat. My 2 cents:

Well-thought-out, sophisticated ways of modeling data for analytics purposes - using established approaches - are being replaced by just pulling data from the sources, with barely any change to the source structure, into cloud data platforms.

In the past we used to model layers in a data-warehousing infrastructure, each with a purpose and a data modelling methodology. For instance, an operational data store (ODS) layer, integrating data from all the sources, with a normalized data structure. Then a set of datamarts, each containing a subset of the ODS content in a denormalized format, each focused on a specific functional domain.

We had rules, methods to structure data in order to get performant reporting, and a customer orientation.

Coming from this world, it seems like data governance principles are gone, and it feels like some organisations use the modern data stack the same way each analyst would build their own Excel files in their own corner, without any safeguards.


What do they need Google Analytics for? Is it a must-have or a nice-to-have? In my experience most small website owners have web analytics set up but barely ever check the reports.

Some alternatives:

  * don't have web analytics at all
  * self-host Plausible Analytics or another open source analytics solution
  * use the data from server-side access logs (for those using nginx, apache or other similar solutions)
  * use Vercel web analytics' free tier (relevant for kanadojo which appears to be hosted there) - more privacy friendly than Google Analytics.
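The access-log option can be sketched with standard tools. This rough page-view count assumes an nginx-style combined log format; the sample file and path here are fabricated for illustration, so point `awk` at your real access.log instead:

```shell
# Fabricated sample log in the combined format (three requests, two paths)
cat > /tmp/sample_access.log <<'EOF'
1.2.3.4 - - [01/Jan/2025:00:00:00 +0000] "GET / HTTP/1.1" 200 512 "-" "Mozilla"
1.2.3.4 - - [01/Jan/2025:00:00:01 +0000] "GET /about HTTP/1.1" 200 256 "-" "Mozilla"
5.6.7.8 - - [01/Jan/2025:00:00:02 +0000] "GET / HTTP/1.1" 200 512 "-" "Mozilla"
EOF

# Field 7 of the combined log format is the request path;
# count hits per path, most popular first
awk '{ print $7 }' /tmp/sample_access.log | sort | uniq -c | sort -rn
```

This is obviously cruder than a real analytics product (no bot filtering, no sessions), but for a small site it often answers the only question the owner actually has: which pages get visited.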


Many banks in EU countries make it mandatory to have their smartphone app installed in order to validate operations clients perform in their web browsers :/


What do they reply when you tell them you do not own a smartphone?

Even if you do own such a device, they don't need to know that.


They give you a hardware token that spits out some numbers and use that as your second factor instead. Usually after a lot more fiddling than a TOTP app would require.

Or they don't and tell you to use a different bank.


Sure, if you truly need the Android HSM, Walmart sells $40 tablets that can run it. You can buy one and keep it in your desk drawer just for banking.


Some airlines (looking at you, Ryanair) really exploit the system. Cabin luggage can cost triple the price of the actual ticket, and that extra fee only pops up later on during the booking process.

What’s worse, you’re forced to buy a bundle with ‘Priority Boarding’ just to get cabin luggage - no option to buy it alone.

The ‘priority boarding’ option is a scam in itself: you pay extra just to stand around in a crowded corridor for about 30-40 minutes while the last passengers get off the plane and the cleaning crew takes the trash out. Ryanair planes don't seem to get cleaned anymore between two flights; there's no time for that.


If you think this will bamboozle Ryanair, and that they don't already have a plan to make money off this change, I think you're mistaken.

