If you only lock on writes rather than reads, it prevents two writers from colliding, but readers can see torn data (a mix of old and new states). Strictly speaking, if either the write or the read is non-atomic, this is outright undefined behavior, and it's legal for the compiler to generate (e.g. reading) code that assumes concurrent writes never happen and misbehaves if they do (for example, the reader thread loading the value twice, assuming both loads return the same value, and breaking when they don't).
To prevent tearing, you can use a regular mutex (only one reader at a time) or a readers-writer lock (https://en.wikipedia.org/wiki/Readers%E2%80%93writer_lock), which allows multiple concurrent readers but has more overhead than a regular mutex, and under contention must prioritize either readers or writers. If you don't want readers or writers to block, you can use atomic reads and writes (for values of 64 bits or less), triple buffers (for a single writer and a single reader), or RCU or hazard pointers (in general).
If the value is small enough (say, one byte), and you know for a fact that your CPU and memory system will always update it atomically, then yes, technically you could get away with not locking on reads.
BUT, for large values (structs) you will end up getting torn reads if another thread is writing at the same time. And if no other thread is writing at the same time, what's the harm in locking for reads? Modern mutexes are fast to lock when there's no contention.
BUT ALSO, many operations are actually a read followed by a related write. If you don't lock on reads, it's easy to accidentally compose multiple small, correct functions into one large, incorrect function.
Your comment reinforces nyanpasu64's point that the average C++ developer doesn't understand multithreading. Your comment is wrong: it is never correct in a standard-conforming C++ program to read and write the same object from different threads without synchronization, even if the CPU and RAM would support it. The compiler itself will not support such an access pattern, and it may reorder instructions not protected by a barrier in ways that result in undefined behavior.
It is always wrong. Compiler optimizations and CPU dark magic like instruction reordering will wreak havoc on your unprotected reads. That is what undefined behavior really means, not "it's actually defined but we don't want to tell you".
Which isn't a problem if you have blind judging of submissions--just make sure that your submission pool is diverse, and pick the best papers without reference to race, orientation, or gender.
...and if the field itself has a diversity problem, then what do you do? This is not a conference-level problem; it is a problem that starts much earlier in life and will only be fixed by solving it earlier in life. Go to middle schools and figure out why girls who were doing well in math and science in elementary school suddenly lost interest in those subjects, and once you have worked your way from there to having more diversity in technical professions, we can talk about whatever diversity problems are left at conferences.
It may not be a conference-level problem, but diversity at the conference level can go a long way toward showing girls in middle school that STEM careers are viable for them. We know that a lot of the reason girls drop out along the way is just the perception of STEM as a male-dominated field (and paralleling that, in countries like China, where it's not considered strictly male, we see high female participation).
"diversity at the conference level can go a long way to showing girls in middle school that STEM careers are viable for them"
When was the last time you saw middle school students wandering around at a conference?
"We know that a lot of the reason girls drop out along the way just is the perception of STEM as a male dominated field"
Yes, clearly that's part of the problem. So why didn't the organizer of this conference go out of her way to invite middle school girls to see all the women she managed to get into the conference?
When I was an undergrad, the EE department had a problem: the policy of doubling female enrollment each year had to be revised to having any female enrollment at all. Part of the solution was to print new admissions pamphlets showing equal numbers of men and women, and equal numbers of white, Asian, and black people (none of these proportions even remotely reflected the reality of the department), smiling while working on their breadboard projects (also somewhat disconnected from reality). This is forgivable, of course, for the following reasons:
1. It is an advertisement. Advertisements always paint a rosier picture.
2. Nobody received any sort of career boost from being featured in the pictures.
3. The pamphlets were sent to high schools, which is exactly who the department needed to target to meet the goal of increased female enrollment.
Compare that to the conference:
1. Conferences are not advertisements for a field or job (usually)
2. Being invited to speak at a conference is a career-booster
3. The demographics at a conference have no impact on middle or high school girls' attitudes about math and science.
You continue to try to locate the problem in middle school while removing any means, direct or indirect, of addressing it. As you observe, speaking at a conference is beneficial to one's career; fostering female participation in conferences advances the careers (and visibility) of women in the field generally. You don't need to bus in a bunch of twelve-year-old girls and point at the female speakers. You just need the field itself not to look so totally male. In medicine, rising female participation outside of nursing has demonstrably accelerated the rate of female participation since the 1970s. The same can happen in the rest of STEM. It's obviously not a turnkey solution, but it is valuable as part of a general attempt to diversify the field.
Yeah, but in high-performance web apps the real competitors to C and C++ are JVM languages (Java, Scala, etc.), not Ruby and Python. Twitter switched from Ruby to Scala/Java because Ruby's performance was terrible.
Could they have used C/C++ for better performance? Yeah, but Java was good enough, and for most high-performance web apps it is good enough; the cost is some extra money for hardware. (For Ruby and Python the cost is a lot higher, and sometimes even expensive hardware can't solve the problem.)
Just like C++ became good enough and people switched to it from C for high-performance apps, and just like C became good enough and we switched from assembly to C.
Are you saying this is always wrong? I thought that depended on the use case at hand.