Bigga
Well-Known Member
Some big films and TV series have been killed by the negative publicity around their productions, because the people starring in them effectively put off a great deal of the viewing public.
Hollywood makes some excellent stuff that would stand up to scrutiny and sell well no matter what the competition is, but it's strangling itself, just my opinion.
Yes, I agree, and Hollywood is at a crossroads in terms of filmmaking rather than the producing of films. Well, a bit of both, really.
The 'Mary Sue' aspect of films these days is a problem. The question is: if films are, for the most part, a fantasy, should they depict women in roles that are an expected generalisation of what they can do, or should women be depicted in more 'natural' supporting roles, with the occasional breakthrough into male roles, especially in action?
Are we seeing a reaction to unfamiliarity in depiction? "Wicked" and "Barbie" made a billion and more because they depicted familiarity and determination, supported by women. Action films featuring female leads as 'Boss Babes' tend to bomb completely, despite being made to show the women who saw "Barbie" that they could do this too.
So, what's the answer?

