Hacker News | Etheon's comments

I created Multy 5 years ago. Thanks to a post on HN at that time, it was a small success (https://news.ycombinator.com/item?id=25870504)

Since then, I've continued to maintain the website, and I now have around 50k lists, 8k users, and around €400 MRR (ads and subscriptions).

I'd love to see more users, but I'm glad of what I've done with Multy!

If you want to check: https://multy.me


how do you deal with spam/abuse?


Dashboard and moderation. I created an admin dashboard to check users and lists and can easily delete / ban users.

Right now it's manageable given the number of lists created per day.


Thank you for your feedback!

That was my first idea, but as I thought about it I asked myself: what's the most important feature? A browser extension that can grab a simple txt file from the web.

There are already some great tools for this (gist, pastebin in particular). Developing the infrastructure to manage the lists, although not very complicated, took a bit more time.

Here I just have a simple HTML page to present the extension; the extension itself is hosted by Google (and soon Firefox, I hope...) and... that's it :)

But of course, if the idea takes off, I'll make it available!


Hmmmm, this is weird: the metadata scraper seems to have issues with Twitter, but it works fine with all the other websites I tried...

I'll check asap and try to fix it :) Thanks for the report


Thank you for your report, I'll try to fix the issues you mentioned. For the last one, it takes some time because I verify the metadata from the URLs and check that the website exists. I don't know if it's a good idea, but this is how it works right now, and it's why you see the target URL with a 301.

I don't want users to do link cloaking on Multy; if people click on something on Multy, they should know where they're going. That's why I do this, and why I also want to integrate a domain blacklist.
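A minimal sketch of a pre-save check along those lines, assuming a Node backend (the shortener list and helper name are made up for illustration): reject anything that isn't plain http(s), and flag known URL shorteners so the real destination can be resolved before the list is published.

```javascript
// Hypothetical sketch of an anti-cloaking check before saving a link.
// A real implementation would also follow the redirect chain for the
// flagged hosts; here we only classify the submitted URL.
const SHORTENERS = new Set(['bit.ly', 't.co', 'tinyurl.com', 'goo.gl']);

function inspectSubmittedUrl(raw) {
  let url;
  try {
    url = new URL(raw);
  } catch {
    return { ok: false, reason: 'not a valid URL' };
  }
  if (url.protocol !== 'http:' && url.protocol !== 'https:') {
    // blocks javascript:, data:, file: etc.
    return { ok: false, reason: 'only http/https allowed' };
  }
  if (SHORTENERS.has(url.hostname.toLowerCase())) {
    // known shortener: resolve the final destination before publishing
    return { ok: true, needsResolution: true };
  }
  return { ok: true, needsResolution: false };
}
```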


Are there publicly available domain blacklists? I'm curious how you would implement such a thing. Would you care to share?


Well, I've been looking for this type of tool since this morning ^^

I found several blacklists, mostly for email spam protection, e.g.:

http://uribl.com/ https://www.spamhaus.org/

The difficulty is figuring out how to work with the database and check each URI.

I found some packages on npm like this one: https://www.npmjs.com/package/dnsbl-lookup

But I have to test it thoroughly before putting it into production.


I found this one (just by googling, no reference): https://github.com/Ultimate-Hosts-Blacklist/Ultimate.Hosts.B...
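That repo ships hosts-file style lists (lines like `0.0.0.0 bad.tld`). A minimal parser sketch for that format, assuming you've already downloaded the file:

```javascript
// Parse a hosts-file style blacklist into a Set of domains.
// Handles comments, blank lines, and both "ip domain" and bare-domain lines.
function parseHostsBlacklist(text) {
  const blocked = new Set();
  for (const line of text.split('\n')) {
    const trimmed = line.trim();
    if (trimmed === '' || trimmed.startsWith('#')) continue; // skip comments
    const parts = trimmed.split(/\s+/);
    // hosts format is "<ip> <domain>"; plain domain lists have one field
    const domain = parts.length > 1 ? parts[1] : parts[0];
    blocked.add(domain.toLowerCase());
  }
  return blocked;
}
```

With the list in a Set, checking a submitted link is a constant-time `blocked.has(hostname)`.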


Wow thank you, seems awesome!!!


Thanks! And yes this is interesting.

For a few days now I've been thinking about an API system allowing users to create lists directly, without going through the Multy UI: a POST that would return a unique link to a JSON document. This would let people integrate Multy into their own site/application. If I had to monetize part of Multy, maybe it would be this API. What do you think?


Thank you :) I didn't know OneTab, but it looks really useful for a guy like me with 100 open tabs.

I will check if I can create my own browser extension!


Thank you :)

So, for example, multy.me/xXere-1 would take you directly to the link?


Sorry, I didn't do a great job explaining what I meant. Really just comes down to having each link on the multi-link page listed with a number beside its title.

--- Typically[1] i'll put numbers inline[2] for each reference I make[3]. And then at the bottom of the comment just list them as so:

[1] www.typical.com

[2] www.inline.com

[3] www.make.com

---

But if I replace the listed references with a single link at the bottom of the page to your site:

     References can be found here: https://multy.me/M9MU0A
Then it would be nice if the individual numbers (1, 2, and 3) were obvious on that linked page beside each reference. But it's not a huge difference; it was just an initial thought when I first tried out your site. It's still useful even without numbered links, and I can't guess how many people would use them.


Ok I understand! Easier than I thought ;) Being able to have numbered links is definitely something I can do. I'll add that to my to-do list! It could be an option, not necessarily visible by default.


Well, as it's one of my first "big" projects as a developer, it can be scary, but my GitHub project is public: https://github.com/Etheonor/multy.me

Feel free to check it out; I'd be happy to get some feedback on my code. Since I'm self-taught, I'm sure I make big mistakes!


In my experience, every self-taught developer I hired was 10x better at solving problems and getting things done on their own than the traditionally taught ones.


I hear that a lot, but the impostor syndrome is always strong when you haven't done anything publicly before!


I second this. I went to college to study CS but most of my time learning to program was largely "self taught" lol.


Good question, and right now I don't have an answer... I thought about checking a blacklist before creating any list, but I had trouble finding a good domain blacklist.

We can always handle spam by checking IPs / number of requests, but that's easier said than done ;)
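The "checking IP / number of requests" part can at least start very simply. A minimal fixed-window, in-memory rate limiter sketch (limits and names are example values, not anything Multy uses):

```javascript
// Allow at most MAX_LISTS_PER_WINDOW list creations per IP per minute.
// In-memory only: counters reset on restart and don't share across servers.
const WINDOW_MS = 60_000;
const MAX_LISTS_PER_WINDOW = 5;
const counters = new Map();

function allowRequest(ip, now = Date.now()) {
  const entry = counters.get(ip);
  if (!entry || now - entry.start >= WINDOW_MS) {
    counters.set(ip, { start: now, count: 1 }); // new window for this IP
    return true;
  }
  entry.count += 1;
  return entry.count <= MAX_LISTS_PER_WINDOW;
}
```

For anything beyond a single server, the same counters would live in something like Redis instead of a Map.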


Just so you're aware: this is a problem you probably need to solve sooner rather than later.

When you're discovered by spammers, they will use your service to mask their true domains and share links to social media sites, which will in due course blacklist your domain. :-(

If volumes are low enough just now I'd consider some kind of manual moderation.

(Maybe you could use the referrer header to detect whether the user has come from somewhere else, and require moderation in that case vs. a direct link emailed to a group.)
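That referrer heuristic could look something like this (a sketch; the function name and return convention are made up, and the Referer header is easily spoofed or stripped, so this is only a coarse signal):

```javascript
// Flag a visit for moderation when it arrives via another site's link;
// direct visits (no Referer, e.g. a link pasted into an email) pass.
function needsModeration(headers, ownHost = 'multy.me') {
  const referer = headers['referer']; // note: the header is spelled "referer"
  if (!referer) return false; // direct visit / emailed link
  try {
    return new URL(referer).hostname !== ownHost; // came from elsewhere
  } catch {
    return true; // malformed referrer: be cautious and hold for review
  }
}
```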


I agree. I found this package https://github.com/Planeshifter/node-spam-detector

I'll check whether it's outdated, and I'll try to build my own system.
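Before wiring in a trained classifier like that package, a crude keyword score can serve as a first filter for the "build my own system" idea. A sketch (the word list and threshold are illustrative, not from the package):

```javascript
// Naive first-pass spam filter: count how many known spam phrases
// appear in the submitted text and compare against a threshold.
const SPAM_WORDS = ['viagra', 'casino', 'free money', 'crypto giveaway'];

function spamScore(text) {
  const lower = text.toLowerCase();
  return SPAM_WORDS.filter((w) => lower.includes(w)).length;
}

function looksSpammy(text, threshold = 1) {
  return spamScore(text) >= threshold;
}
```

Anything this flags could go into the manual-moderation queue rather than being rejected outright.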


Hey that's a great idea!

I've never built a browser add-on, so it will make a good challenge. Maybe for this weekend ^^


Yeah, it's not that bad. Here is a simple bookmarklet example that grabs the current URL and posts it to your endpoint (Mailephant in this case). Might be a good first step.

  javascript:q=location.href;
  if(document.getSelection)%7Bd=document.getSelection();%7Delse%7Bd='';%7D;
  p=document.title;
  void(open('https://mailephant.email/views/curate.html?showtags=yes'
  +'&url='+encodeURIComponent(q)+'&description='+encodeURIComponent(d)
  +'&title='+encodeURIComponent(p),'Mailephant','toolbar=no,scrollbars=yes,width=550,height=700'));


I had to add some newlines to your JS because it was borking the page layout. Sorry; it's our bug. One of these days we will fix it.


You just save that as a new bookmark after updating the endpoint, and you can gather that data quickly.


Hello! This thread is no longer on the front page, but just to keep the few people still passing by informed: I finally created a browser extension! It complements the site :)

https://addons.mozilla.org/fr/firefox/addon/multy/

I'll submit it to Google for Chrome as well.


Thank you for the tips! I will definitely try that this week end and keep you updated.

