
Government against 2/3 of the public. Is it likely that this is just propaganda?

Q&A › Category: General
asked 2 years ago

A survey was published tonight that found that about 2/3 of the public is against the revolution the government planned to make.
I didn’t see the details of the survey, but that’s what was published in the news. Hasid’s survey sounds credible.
Hmmm?
Just about 5 months ago, almost 49% of the public voted in favor of this government.
This is not a majority of the public, but it is much more than the roughly one-third that the survey actually gives it.
So what happened along the way that its standing fell?
Option A.
Although almost 49% wanted this government in general, there were far fewer who specifically supported the revolution.
Option B.
Although the government had the support of almost half of the public, that support has steadily eroded over time.
Option C.
The government did not explain itself, the opponents did explain their case, part of the public was convinced, and the result: support dropped from just under 49% to around one-third.
Option D.
All answers are correct.
 
What does the Rabbi think is the correct answer?
Or are there other options?


1 Answer
Michi (Staff) answered 2 years ago

I don’t see the importance of this discussion, why it belongs on this site, or, especially, what added value I have regarding it. I assume the main factor is a growing understanding of what the reform entails, along with implications that were not clear until it began to be implemented. Other options could, of course, also be correct, but this seems to me to be the main one.

Ish replied 2 years ago

I suppose there is also the possibility that the poll is wrong (I, for example, voted in the last election, but haven't answered any polls recently).

B. replied 2 years ago

After Haaretz fired Gadi Taub because it was wartime, there is a good chance that the polls are rigged. The pollsters behind these surveys usually belong to left-wing circles (so much so that Likud voters enjoyed misleading them in the television exit-poll samples, to the point that Mina Tzemach wanted to resign over the lies), since polling is a subfield of the media world, where the “right opinions” prevail. In any case, here are two posts by Nadav Shnerb about how polls are conducted:

Post from 10/30/22: “A comment thrown out by Amit Segal sent me to check what the hell the election pollsters in this country are doing. My conclusion: the pollsters do not report the results of the polls they conduct, but rather their guess about the election results after having seen the polls.
Here is the argument. Polls usually sample 500 people at random from the country’s population. Now suppose United Torah Judaism has the support needed to win seven seats. How many people in the random sample will say they intend to vote for it?
To win seven seats, a party needs to get 5.83 percent of the vote. If you work through the numbers, you’ll see that the pollster has to declare seven seats for the party if, out of the 500 people sampled, between 27 and 31 said they would vote for Goldknopf’s party [1].
And here’s the question: suppose there is a population in which exactly 5.83% of the people are Agudah voters. If I ask 500 random people, what is the probability that the number who say “Agudat Yisrael” will be between 27 and 31? I checked, and the answer is 0.367, not far from a third. Hence, if the pollsters were reporting the results of their polls, and not their fantasies, in about two-thirds of the polls Agudah should have received a different number – five, or six, or nine, not exactly seven. Seven should appear more often than any other number, but certainly not exclusively.
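Shnerb’s 0.367 figure can be checked with an exact binomial computation. A minimal sketch, assuming a simple random sample of 500 and a true vote share of exactly 5.83% (variable names are mine):

```python
from math import comb

def binom_pmf(n: int, k: int, p: float) -> float:
    """Probability of exactly k successes in n independent trials."""
    return comb(n, k) * p**k * (1 - p)**(n - k)

n, p = 500, 0.0583  # sample size; assumed true vote share for a seven-seat party
prob_seven_seats = sum(binom_pmf(n, k, p) for k in range(27, 32))
print(round(prob_seven_seats, 3))  # ≈ 0.367, matching the post
```

So a party polling at exactly seven-seat strength should land on exactly seven seats only about 37% of the time in an honest random sample.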
And hence the actual results: since the beginning of September, Channel 12 has conducted nine election polls, in all of which (!!) Gafni and Goldknopf’s party received exactly seven seats. The same reliability was demonstrated by the pollsters of Channel 13 (nine out of nine), Kan 11 (7/7), Channel 14 (7/7), Israel Hayom (3/3) and Galei Tzahal (2/2). The only ones with fluctuations are the people at Maariv: out of 8 polls, two gave UTJ six seats, and the rest (6/8) stuck to the legendary number seven.
So what are the chances that the pollsters are reporting the results they actually got? The chance that nine polls will all give exactly 7 seats, assuming, again, that the party’s support is 5.83 percent, is about one in ten thousand. The chance that more than 40 polls (counting all the channels) will produce only two results different from seven is, under the same assumptions, too small to talk about – a millionth of a millionth of a millionth, or a little less.
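The “one in ten thousand” estimate then follows from independence: if each honest poll lands on exactly seven seats with probability about 0.367, nine in a row is 0.367^9. A quick check (assuming independent polls, as the post implicitly does):

```python
p_seven = 0.367          # P(a single honest poll shows exactly 7 seats)
p_nine_straight = p_seven ** 9
print(p_nine_straight)   # ≈ 1.2e-4, on the order of one in ten thousand
```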
I ran a similar test for Ra’am, where the situation is even worse. Out of 42 polls, only one, from Channel 12, gave a number different from four seats. Again – we are supposed to believe that in 41 cases 500 random people were asked, and that in each of these cases between 17 and 20 of them said they would vote for the party. The probability – zero.
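The Ra’am case can be sketched the same way. The post does not give the true vote share, so the 3.7% below is my assumption for illustration (the midpoint of the 17–20 band, 18.5/500):

```python
from math import comb

def binom_pmf(n: int, k: int, p: float) -> float:
    """Probability of exactly k successes in n independent trials."""
    return comb(n, k) * p**k * (1 - p)**(n - k)

n, p = 500, 0.037  # assumed true vote share for a four-seat party
# P(one honest poll lands in the 17-20 respondent band, i.e. shows four seats)
q = sum(binom_pmf(n, k, p) for k in range(17, 21))

# P(at least 41 of 42 independent polls land in the band)
p_41_of_42 = comb(42, 41) * q**41 * (1 - q) + q**42
print(q, p_41_of_42)  # q is roughly a third; p_41_of_42 is astronomically small
```

Under these assumptions the 41-of-42 probability comes out around 10^-17 – “zero” is a fair summary.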
The parity between the blocs reflects a similar statistical problem. For a tie, or a 61:59 result in favor of one bloc, the larger bloc must receive between 250 and 256 of the 500 responses (otherwise it would have 62). Assuming the public is split exactly 50:50 between the blocs, in a sample of 500 random people there is a slightly better than fifty percent chance that one bloc wins by a margin of more than two seats. I haven’t checked, but it seems to me that for some reason all the recent polls fall in the 59-to-61 range. Again – a ridiculously improbable result.
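This claim checks out under the same binomial model: with the electorate split exactly 50:50, the band of sample counts that yields a 59–61 seat result (244–256 of 500 respondents for one fixed bloc) captures less than half the probability mass:

```python
from math import comb

def binom_pmf(n: int, k: int, p: float) -> float:
    """Probability of exactly k successes in n independent trials."""
    return comb(n, k) * p**k * (1 - p)**(n - k)

n = 500
# A result between 59:61 and 61:59 means one fixed bloc got 244-256 respondents
p_close = sum(binom_pmf(n, k, 0.5) for k in range(244, 257))
p_wide = 1 - p_close
print(round(p_wide, 3))  # ≈ 0.56: a gap of more than two seats is more likely than not
```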
In my opinion, we can assume with some confidence that this whole polling business is fabricated. The pollsters start from basic assumptions about the sizes of the populations they “paint” as voters of specific parties, and choose respondents not randomly but in a way that will give the “correct” result. Very likely they also glance left and right, examine the predictions of fellow pollsters, and “iron out” their results to match: in image terms, it is much cheaper to be wrong together with everyone than to be wrong alone.
What does all this mean? As far as I understand, the conclusion is that it is possible (not certain, of course) that the pollsters are missing the mark, that there is no tie at all between the blocs, and that parties will receive much less or much more than the polls predict for them. As always, the public (meaning you and me) tends to remember the bad and not the good – the pollsters’ big mistakes and not their successes. We therefore give pollsters a clear incentive to predict a draw (so they are never too wrong) and to align their predictions with those of their peers (so they err “with the flow”). The result: polls that are worth much less than we think.”

Post from 20.2.23: “Here is an interesting phenomenon I came across through a discussion, on a slightly different topic, on Andrew Gelman’s blog.
There is something people call “public trust in the media”, and there are research institutes that try to measure this trust, both at the level of individual countries and comparatively, across many countries around the world. One of these institutes is affiliated with Reuters; the other is the “Edelman Trust Barometer”. What these institutes do is conduct public opinion polls in which people are asked about their attitudes toward the media.
It turns out, to most people’s surprise, that there is no connection between the results the two institutes reach. There is no significant correlation either in the levels (what score institute A gives public trust in, say, Chile, compared to institute B) or in the trends (by how much public trust in Chile decreased or increased over a given period of time).
What is going on here? I assume that two such respected institutes use standard survey methods. Moreover, it is hard to imagine they have any political interest – what does an American researcher care how popular the media is in Indonesia or Turkey? So how can there be such differences, with essentially no connection between the results?
The person who raised the point suggested that the difference stems from the wording of the question. One institute asked whether “the media is doing the right thing”, while the other asked people whether they “trust most of the news most of the time”. Interesting. Either way, you can see here how unreliable and inconsistent these kinds of public opinion polls are, even when conducted by professionals with no vested interest.
Compare this to election results: here in Israel we have had five election campaigns within a four-year period, and the results hardly changed. There was more or less zero transfer of votes between the blocs, and apart from the story of Bennett, who decided (by his own account – with his eyes open) to publicly spit on his base, even the individual parties received more or less the same number of votes. The results changed from election to election only because of “betrayals” by politicians within their bloc, or because of tiny fluctuations amplified by the high electoral threshold. After all the propaganda, the provocations, Operation Guardian of the Walls, the coronavirus – nothing moved. People vote for the things that seem important to them, and the events and upheavals of the political debate did not seem important enough to them.
Conclusion: when you are told about the “loss of public trust” (in the High Court, the Knesset, the coach of the Israeli national team) or about the “positions of Likud/left-wing voters” – take the story with a huge grain of salt. Presumably the way the question is phrased in such surveys dramatically affects the result, and when interested parties are involved – it is a waste of time. Ignore. Simply ignore.”

Michi (Staff) replied 2 years ago

Obviously. Anyone who says anything that doesn’t suit the Bibists is a leftist, and so the Bibist theory becomes irrefutable. Good luck.
As for the silly claims you’ve made here, the fact is that most polls are remarkably accurate at predicting election results. Somehow their abilities drop dramatically precisely where the results cannot be checked (because there are no elections to measure them against). There, any Bibist can make “scholarly” claims explaining why the polls are worthless because everyone is a leftist.

