This is not usually the case. Initial resume reviews are often done by HR, some algorithm, or at least somebody who is not who you will actually be working with on the day-to-day. It is incredibly difficult to root out all of the biases in the job application process and even automated attempts have failed[1]. It is best to avoid any opportunity for bias to even enter the system. To say that you're okay being rejected from a job due to your appearance because you wouldn't want that job anyway is defeatist and hurts the cause of those who are truly disadvantaged and would very much like to be employed.
Would you be interested in sharing your template? I used to use TeX, but the available templates never interested me, and I didn't have the time to DIY a good one.
It makes sense, and it is how the internet works. Servers cherry-pick who sees their content all the time. Scrapers are often blocked, as are entire IP address ranges. Scrapers built on tools like Selenium can be (approximately) detected and are often denied access.
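A minimal sketch of what that filtering can look like server-side, assuming you only have the User-Agent header to go on. Real detection stacks combine many signals (IP reputation, request rate, TLS fingerprints), so this is illustrative, not a recommendation; the token list is hypothetical.

```python
# Sketch: flag requests whose User-Agent matches common automation tools.
# Headless Chromium, for example, identifies itself as "HeadlessChrome"
# by default, though scrapers can and do spoof this header.

SUSPICIOUS_UA_TOKENS = (
    "headlesschrome",   # default headless Chromium UA
    "phantomjs",
    "python-requests",
    "scrapy",
)

def looks_like_scraper(user_agent: str) -> bool:
    """Return True if the User-Agent contains a known automation token."""
    ua = user_agent.lower()
    return any(token in ua for token in SUSPICIOUS_UA_TOKENS)
```

A server could deny or rate-limit requests where this returns True, which is roughly the "(approximately) detected" caveat above: it catches the default configuration, not a determined spoofer.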
I’m not sure about it being anti-competitive. Serving a website means opening up your resources for others to access. My friend runs a free, open source stock market tracking website. A couple of months back he started getting hit with scrapers from big hedge funds and fintech companies, and serving all of them now costs him around $50-100 a month.
He and I both have similar free open source websites with donate buttons. They are rarely clicked. Ad revenue over a month for me has been ~$400 while donations over two years have totaled $20. There are about 80,000 unique visitors per month.
It is nice to think donation platforms can fund high traffic open source projects, but this is simply not the case.
In any case, I fear this ruling could limit developers’ ability to protect their servers, forcing us all to roll over while the big players’ hefty scrapers take our data for resale.
How long are you allowed to delay results? Not serving results at all is just delaying them forever, but that's apparently out. Can I delay serving results longer than Chromium's default timeout?
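The delay idea above is essentially a "tarpit": instead of refusing suspected scrapers, drip the response out slowly so automated clients time out or become uneconomical to run. A minimal sketch, assuming a framework that accepts a generator as a streaming response body; the function name and the chunking scheme are my own illustration, and Chromium's actual timeout is version-dependent, so you'd have to measure it rather than trust a hard-coded figure.

```python
import time
from typing import Iterator

def tarpit_body(payload: bytes, total_delay_s: float,
                chunk_count: int = 4) -> Iterator[bytes]:
    """Yield the payload in chunks, sleeping between them.

    Spreading total_delay_s across chunk_count pieces keeps the
    connection alive (each chunk resets per-read timeouts) while
    making the full response take as long as you choose.
    """
    size = max(1, len(payload) // chunk_count)
    for i in range(0, len(payload), size):
        time.sleep(total_delay_s / chunk_count)
        yield payload[i:i + size]
```

Whether deliberately stalling just under a client's timeout would count as "serving results" for the purposes of the ruling is exactly the open question being asked.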
https://www.reuters.com/article/us-amazon-com-jobs-automatio...