Hacker News | jgrizou's comments

What are you using now? Asking for a friend

Not OP, but I have been using borg backup [1] against Hetzner Storage Box [2]

Borg backup is a good tool in my opinion and has everything that I need (deduplication, compression, mountable snapshots).

Hetzner Storage Box is nothing fancy but good enough for backups, and it is noticeably cheaper than the alternatives (I pay about 10 EUR/month for 5 TB of storage).

Before that I was using s3cmd [3] to back up to an S3 bucket.

[1] https://www.borgbackup.org/

[2] https://s3tools.org/s3cmd

[3] https://s3tools.org/s3cmd
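For reference, a Borg + Storage Box workflow could look roughly like the sketch below. This is an assumption about the commenter's setup, not their actual configuration; the username/hostname (`u123456` / `u123456.your-storagebox.de`) are placeholders, and Hetzner Storage Boxes accept SSH on port 23.

```shell
# Placeholder repository URL; adjust user, host, and path.
export BORG_REPO='ssh://u123456@u123456.your-storagebox.de:23/./backups/main'

# One-time: create an encrypted, deduplicating repository.
borg init --encryption=repokey-blake2

# Nightly: create a compressed archive named after host and date.
borg create --compression zstd --stats ::'{hostname}-{now:%Y-%m-%d}' ~/documents

# Snapshots remain individually mountable for browsing/restores:
borg mount ::myhost-2024-01-01 /mnt/restore
```

Deduplication means successive nightly archives only store changed chunks, which is what keeps 5 TB of quota going a long way.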


I cannot edit any longer, the second link was supposed to be

[2] https://www.hetzner.com/storage/storage-box/


Thanks, I was quite confused for a second

It's not replicated like Backblaze, right?

This is quite a bit more expensive than Backblaze if you have more than 5 TB or so.

I use rsync.net. You can use basically any SSH tool or the rclone interface. They have a cheaper plan for "experts" if you want to forgo ZFS snapshots: https://www.rsync.net/signup/order.html?code=experts

Rsync.net is really really good.

Just this weekend, my backup tool went rogue and exhausted my quota on rsync.net (some bad Borg configuration on my part). I emailed them, and they promptly added 100 GB of storage for a day so that I could recover the situation. Plus, their product has been rock solid for the few years I've been using them.


> Emailed them, they promptly added 100 GB storage for a day so that I could recover the situation

Sorry to hear about your troubles. Hope your backup situation's sorted out?

Do you recall if you used a link like this to sign up?

https://www.rsync.net/signup/order.html?code=experts

If you don't, a good heuristic would be to see how much you pay per GB: if it's less than a cent, you probably did. The plans that come with support are typically a shade above a cent per GB.


Just a note of caution: sync != backup. When I was younger and dumber, I had my own rsync cron script to do a nightly sync of my documents to a remote server. One day I noticed files were gone from my local drive; I think there were block corruptions on the disk itself, and the files were dropped from the filesystem, or something like that. The nightly rsync propagated the deletions to the remote "backup."
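The failure mode is easy to reproduce. A minimal demonstration, assuming the cron script mirrored with `--delete` (the usual way to make rsync a true "sync"), using temp directories as stand-ins for the local disk and the remote server:

```shell
# Stand-ins for the local documents folder and the remote "backup".
local_dir=$(mktemp -d); remote_dir=$(mktemp -d)
echo "important" > "$local_dir/doc.txt"

rsync -a --delete "$local_dir/" "$remote_dir/"   # nightly sync: doc.txt arrives
rm "$local_dir/doc.txt"                          # disk corruption eats the file
rsync -a --delete "$local_dir/" "$remote_dir/"   # next sync propagates the loss

ls "$remote_dir"                                 # doc.txt is gone here too
```

A versioned backup tool (borg, restic, etc.) avoids this because old snapshots still contain the file even after the source loses it.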

D'argh.


Musing: there's a step further along the spectrum that echoes the same relationship, where "backup" != "backup that resists malware".

In other words, a backup can be degraded into a sync-to-nothing situation if the client logic is untrustworthy.


Thanks for your kind words.

Just to clarify - there are discounted plans that don't have free ZFS snapshots but you can still have them ... they just count towards your quota.

If your files don't change much - you don't have much "churn" - they might not take up any real space anyway.


I don't think a lot of people know you also support Borg 1.x (absolutely correct me if I'm wrong!).

It would be incredible if you started looking into S3-compatible object storage, unless you have made a business decision not to support it.

Thank you for providing an affordable option for self-hosters.


Depending on the use case, rclone can expose an S3 endpoint via `rclone serve s3` and route it to another protocol, e.g. SFTP.

I mention it not to shill rsync.net, but to shill rclone, because when I discovered it I was even more impressed with it.

Obviously, having to run a command and apply some amount of plumbing is different from a service just providing that API at the outset, so the applicability will differ for users. Still, rclone is very cool!
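To sketch the plumbing involved (the remote name `mysftp` and the key pair are placeholder assumptions; `rclone serve s3` ships in recent rclone releases):

```shell
# Expose an existing rclone remote (e.g. an SFTP remote named "mysftp")
# as a local S3-compatible endpoint.
rclone serve s3 mysftp:backups \
    --auth-key ACCESS_KEY,SECRET_KEY \
    --addr 127.0.0.1:8080
```

Any S3 client pointed at http://127.0.0.1:8080 with those credentials then reads and writes the SFTP backend transparently.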


We will continue to specialize in filesystem provision, not object storage.

However, we do support interoperating with object storage tools, such as 's5cmd':

https://news.ycombinator.com/item?id=44248372

... and, of course, rclone, which you can invoke remotely, on our end to move data between cloud accounts, etc.


Thank you for that link!

> not object storage

Happy to email you, if that's better, but is this because of unsustainable competition in the space or the tremendous volatility in consumption that object storage customers bring to the table?

I ask because in this current market, I would imagine investing in storage infrastructure is painful, but then I wonder, you are still in the storage infrastructure space anyways, so it likely has to do with the user behavior or user expectations or both.


Not supporting S3 (and block storage) is not a business decision - it is an ideological decision.

We want to live in a world of UNIX filesystems and we want those to be available in the modern "cloud" ecosystem.


rsync.net and rclone are great; my brain understood restic more easily than Borg for local backups over USB (YMMV), and plain old `rsync --archive` is most excellent at preserving file modification times and the like.

There is 100% a difference between "dead data" (e.g. movie.mp4) and "live data" (e.g. a git directory with `chmod` attributes): S3 and similar often don't preserve attributes and metadata without a special secondary pass, even though the `md5` might be the same.


You can do restic + rsync.net too, if you didn't know. `rclone serve restic --stdio` on their end exposes the restic REST server over stdio; then, with some fiddling, you tell restic to use rclone as the repository address and configure its rclone invocation to actually run `ssh rsync.net`.

Again, as in my other comments, I am not affiliated with rsync.net; rclone is just cool. You can use the same trick with any host that has SSH and rclone installed.
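The fiddling referred to above could look roughly like this (`user@host` is a placeholder for any SSH host with rclone installed; restic spawns the given program and speaks its REST protocol over the process's stdio):

```shell
# One-time: create the repository on the remote side.
restic -o rclone.program='ssh user@host rclone' \
       -r rclone:backups/restic init

# Afterwards: back up as usual through the same tunnel.
restic -o rclone.program='ssh user@host rclone' \
       -r rclone:backups/restic backup ~/documents
```

restic appends its default `serve restic --stdio` arguments to the configured program, so the remote rclone ends up serving the repository over the SSH pipe.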


I have used Arq for way over a decade. It does incremental encrypted backups and supports a lot of storage providers. Also supports S3 object lock (to protect against ransomware). It’s awesome!

How is the performance? For me it takes Arq over an hour just to scan my files for changes.

(Arq developer here) By default Arq tries to be unobtrusive. Edit your backup plan and slide the “CPU usage” slider all the way to the right to make it go faster.

Restic to almost any cloud storage provider. It works perfectly: it is client-side encrypted, with easily configurable retention policies. I have been using it happily for many years (and have also restored some files from the backups).
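For illustration, a restic retention policy can be a single command; the repository URL and the keep counts below are placeholder assumptions, not the commenter's actual settings:

```shell
# Placeholder repository; restic supports s3:, b2:, sftp:, rclone:, etc.
export RESTIC_REPOSITORY=s3:s3.example.com/my-backups

restic backup ~/documents

# Keep 7 daily, 4 weekly, and 12 monthly snapshots; prune everything else.
restic forget --keep-daily 7 --keep-weekly 4 --keep-monthly 12 --prune
```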

Wasabi + rclone works well for me. Previous BB customer.

Would it? Feels a bit like when you use Facebook and hand over all your data.

Yeah, fair. I am just thinking out loud here. What is a decent solution to this problem? Is there one?

Works very well



This is not the same company. The OP, Tiny Corp, accused them of trademark infringement on Twitter due to exactly this kind of misconception.



You can still cancel your pledge. I’ve just done that myself.


During your sleep, we clone you fully, down to the brain connectome, and burn your original body.

On waking up, we show you the video of what happened. You feel like the same person as yesterday when you went to bed. Are you?


What is the difference with https://www.espruino.com/ ?


They are very similar in terms of ROM footprint (esp: 128K vs. mqjs: 100K) and minimum RAM (esp: 8K vs. mqjs: 10K), but spec coverage needs to be examined in detail to see the actual difference.


How about comparing with Duktape (https://duktape.org)?


That is what charity shops do in the UK. You don't make money, as your stuff is a donation, but the charity does when they resell it. Works surprisingly well.


Is this going to make web Bluetooth work via Chrome on iOS?



Why isn't that possible right now? Couldn't they just code the Bluetooth side and bridge Safari to it?

Pretty sure some apps exist already that do this.


I have been trained in machine learning and have started to notice a few early-stage CTOs asking for advice on how best to start an ML project.

It seems many are tasked with implementing an AI/ML roadmap for their start-up. But getting into ML makes them experience a bit of impostor syndrome, with all the jargon and technical-sounding experts out there.

They are all very capable individuals, and I feel that this barrier to entry is unjustified given all the tools and models readily available. So, with some friends, we are trialing a 1.5-hour workshop to cover a few basics and demystify what ML is and what it can do. See https://lu.ma/06ec3e24

We designed this workshop to be as impactful as possible. If you are an experienced software engineer getting into ML, consider joining us.

If initial feedback is good, we are thinking of working toward a 6-week "ML cohort-based masterclass" to help CTOs define a 6-18 month ML roadmap. This would include 1-on-1 sessions with ML experts and access to curated exercises to do with the cohort.


The article is mostly about influence and the importance of surrounding yourself with people you want to learn from.

When you start rebelling, it might be a signal that it is time to find a new environment to thrive in.

