sezna's comments

This is not usually the case. Initial resume reviews are often done by HR, some algorithm, or at least someone other than the people you will actually work with day-to-day. It is incredibly difficult to root out all of the biases in the job application process, and even automated attempts have failed[1]. It is best to avoid any opportunity for bias to enter the system at all. To say that you're okay being rejected from a job due to your appearance because you wouldn't want that job anyway is defeatist, and it hurts the cause of those who are truly disadvantaged and would very much like to be employed.

https://www.reuters.com/article/us-amazon-com-jobs-automatio...


Would you be interested in sharing your template? I used to use TeX, but the available templates never interested me, and I didn't have the time to DIY a good one.



Thanks. These look good.


I've used this template as a starting point with solid success.

https://www.overleaf.com/latex/templates/cv-chandan/vgynfrhc...


It makes sense, and it is how the internet works. Servers cherry-pick who sees their content all the time. Scrapers are often blocked, as are entire IP address ranges. Scrapers driven by tools like Selenium can be (approximately) detected and are often denied access.
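As a rough illustration of the kind of filtering described above, here is a minimal sketch in Python. The user-agent patterns and the CIDR range are hypothetical placeholders (the range is the TEST-NET-3 documentation block), not a real blocklist, and real detection of headless browsers is considerably more involved than substring matching.

```python
import ipaddress
import re

# Hypothetical deny list for illustration only.
# "HeadlessChrome" does appear in headless Chrome's default user agent,
# but sophisticated scrapers can spoof all of these fields.
BLOCKED_UA_PATTERNS = [
    re.compile(p, re.IGNORECASE)
    for p in (r"selenium", r"headlesschrome", r"python-requests")
]
BLOCKED_RANGES = [ipaddress.ip_network("203.0.113.0/24")]  # placeholder range


def should_block(remote_ip: str, user_agent: str) -> bool:
    """Return True if a request looks like a scraper we choose not to serve."""
    if any(p.search(user_agent) for p in BLOCKED_UA_PATTERNS):
        return True
    addr = ipaddress.ip_address(remote_ip)
    return any(addr in net for net in BLOCKED_RANGES)
```

A server would call something like `should_block(request_ip, request_user_agent)` before doing any expensive work, and return a 403 (or simply drop the connection) when it returns True.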

I’m not sure about it being anti-competitive. Serving a website is an action in which you open up your resources for others to access. My friend runs a free, open source stock market tracking website. A couple of months back he started getting hit with scrapers from big hedge funds and fintech companies; serving all of them costs him around $50-100 a month.


If he gives them a stable, fast API with a subscription fee, and the scrapers are truly from hedge funds, he’s going to make a lot more than $100/mo.


He should open up a Patreon, tip jar, something to get that funded.

He could also delay results, offer reduced temporal precision, or take other measures to differentiate use cases.


He and I both have similar free open source websites with donate buttons. They are rarely clicked. Ad revenue over a month for me has been ~$400 while donations over two years have totaled $20. There are about 80,000 unique visitors per month.

It is nice to think donation platforms can fund high-traffic open source projects, but that is simply not the case.

In any case, I fear this ruling could limit developers’ ability to protect their servers, making us all roll over to the big players as their hefty scrapers take our data for resale.


How long are you allowed to delay results? Not serving results at all is just delaying them forever, but that's out. Can I delay serving results longer than Chromium's default timeout?


Probably up to the point where a judge says "this is blocking, not delaying."

