IFR


  1. I'm carrying on this conversation on many fronts, so even though I was quoting you, I wanted to emphasize my perspective here, in case other people wanted to quote my post. Everything at its quantum base is non-deterministic. The human mind is more of a black box right now than most things. However, I don't see anything that would indicate that the human mind is beyond "solving" with an accurate model - a vast Monte Carlo simulation, if you will - given more in-depth knowledge of our neurological structure and vastly more sophisticated methods of computing this data. Do you believe that, given an infinite timeline of advancing technology, there is no route by which an accurate model of the human mind can be made? That the human mind, in this respect, is beyond all other phenomena in the universe? That would strike me as supernatural.

I think this lands on one of the significant barriers to having a practical debate on this issue. It involves our respective estimations of the pace and magnitude of AI evolution, and this is something that, in all likelihood, no one in this world can accurately assess. We know what AI is capable of right now. But what about 20 years in the future? When the Wright brothers flew at Kitty Hawk in 1903, it may have seemed farfetched to many that aviation technology would take us to the moon a little over 60 years later. Then again, in 1903 many probably had high hopes that battery technology would improve far more than reality had in store. The evolution of humans is a historical record. The evolution of machines is a complete unknown. Sometimes it's gradual, and sometimes major, paradigmatic changes cause a huge leap forward. So assessing AI capabilities in even the near future based on current AI technology is a conversation of beliefs more than anything.

Sure, the human mind takes about 20 watts to make its computations, and the Microsoft servers running ChatGPT probably run on megawatts. But this difference in efficiency will narrow to some degree. To what degree is an important qualification, and unfortunately the timeline of that improvement is impossible for us to know. Part of our mechanism is that we are efficient black boxes (for now; the human mind may one day be accurately modeled). I guess you can claim that our mechanism is unique and special in its function. But matter-antimatter annihilation is unique and special in its function. The fusion cross-section of deuterium and tritium is unique and special in its function. There are many natural phenomena that are optimized in their mechanism. When countless mechanisms are unique and special, why single out humans as something notable? We are developing black boxes that may be even more efficient than us one day, depending on the timeline of their evolution. So even our optimized mechanism may be surpassed.

It's true. We simply cannot evaluate the ecosystem of AI and human feedback and what will result. Again, it's impossible to say what this ecosystem will be like.

In my 12 Angry Men hypothetical, since we're talking about future movies being made, I wasn't trying to indicate a 12 Angry Men "derivative". I was trying to suggest that AI could make an entirely original movie that had the same original effect, meaning, and impact as 12 Angry Men, as though that movie had never existed before AI produced it. 12 Angry Men is not original. It is itself a derivative of previous stories. All stories are derivations of familiar, previously established stories.
I think you can term "creativity" as eccentricities in reproducing a dataset, akin to genetic mutations in evolution. 12 Angry Men happened to be a favorable collection of these data eccentricities. I think AI will become quite good at identifying eccentricities that are favorable to producing desirable results in the audience - not just in terms of generating pleasure, but in giving us profound insight into life and who we are. I think this can be achieved with the direction of artists, but I also think it can be obtained independently, without awareness, just by training AI to home in on the appropriate ratio of data eccentricities.

This is indeed personal preference. I think many people will choose a Game of Thrones that didn't end poorly. Or, going back to a different discussion we had, a better Wheel of Time.

I agree. There will always be a niche interest group. But really, this is entirely predicated on AI's advancement, and no one knows how that will evolve. If AI advances the way aviation technology did, then I don't see the popular choice being to seek out art that produces inferior stimuli (emotionally, intellectually - that is, in the sense of beauty and how we see the world and ourselves, etc.) simply because of what made it. If AI advances like battery technology, then yes, human art will probably dominate for a good long while.
  2. Yes, that is my projection. My original assertion was a speculation that the distinction that critics of AI art are drawing will become less relevant. So naturally I am basing this on what I think will happen. Could technology be permanently stymied? Perhaps, but I see no indication that it will be. But the discussions in this thread have largely involved projections, so there is nothing exceptional about me making one. You immediately follow this quote with your own projection.

This is a projection. Is it based on anything other than what you believe will happen (i.e., one can plausibly claim that machines will never be able to replicate human brains)? No. It's possible that this will be true, but I will go so far as to say that it's unlikely.

I claim this is subjective. A person who produces art has their own interface with reality in the process, and a person who perceives art has their own interface with reality, with the respective emotional and intellectual responses of those interfaces. Uncoupling them does not devalue one or the other. Enjoying art that was not created by a human is not mutually exclusive with a human creating art. Both can exist. But then it becomes a competing market of ideas. If AI art prevails as the dominant form of art in society, human art can still have its place. Doubtless there will still be a market for the consumer who refuses to partake of anything but human-produced art, to satisfy their sensibility that humans are inherently superior by the arbitrary mechanism of how their intelligence is expressed.

I don't speak for your experiences. I simply reject the claim that you can construct definitions that speak for my experiences, i.e., that AI art is not art.
  3. No more than most people here. My field is health physics. "Unique" does not in any way inflate the importance of some mechanism to me. And humans are mechanisms. We are a system with our inputs and outputs.

There is roughly a one in a trillion trillion chance that any given neutrino will interact with your body. Those are vanishingly rare odds. Yet it is also a very unremarkable result: trillions of trillions of neutrinos stream through your body over an average lifetime, so interactions do happen. It's simply something that occurs, nothing more. To be clear, something can be unique and can accomplish, at the appropriate scale, something that any other method would be inadequate to accomplish. I mention neutrinos simply to point out that there are a lot of "unique" mechanics in this universe, and "unique" is a highly unimpressive thing in and of itself.

To the rest of this quoted passage, I agree. AI is just a start. We don't know where this will lead, but it is still in its infancy. The rate at which AI is "learning" is certainly outpacing us in many ways. It has been a little under 80 years since Shockley, Bardeen, and Brattain invented the transistor. While Homo sapiens has undergone millions of years of evolution, in this short timeframe we've gone from simple calculations fed in on punch cards to the capability to emulate us. Transistor-based intelligence is also a unique mechanism of inputs and outputs. And while in some ways this mechanism cannot yet compare to us, in other ways it far outpaces our own capabilities. And since the rate of advancement in this mechanism is so vastly quicker than the rate of our own advancement, it seems very reasonable to me to project that we will quickly be eclipsed in areas where AI already shows promise (such as art).

People here keep falling over themselves trying to point out that they don't consider something art if it is not generated with "awareness". HoI used "technical skill", I believe. Which is fine. But should this "technical skill" outpace us, then specifying that something may only be considered art if it is human sourced, or if there was "awareness" in its generation, will not serve any practical purpose. Imagine that I do a quick interview with an AI so it gets a sense of who I am and what my tastes are. Then I instruct it to make me a movie that is challenging and insightful, and the AI produces something like 12 Angry Men. I can watch this movie, or whatever superhero effluvia Marvel is spraying onto its audience. Now someone can insist that 12 Angry Men is not art because there was no "awareness" in how it was generated, and the Marvel effluvia is art because it's human sourced, but this definition would hold no meaning for me. How does it inform my experience of 12 Angry Men being full of meaning and beauty, while the Marvel effluvia is a soulless pollution of cinema? I mean, good for the humans for doing something with their day, I guess? But it's irrelevant to how I'm experiencing art.

This aligns with the point Ran was making. My perspective is that "human mediated" will start to lose its distinction too. For now, you have to make specific prompts to generate art. But as things progress, is the request "make me a movie you think I would like" considered human mediated? That's essentially how human-made movies are often created: a studio commissions a screenwriter to write an (often) specific kind of movie, and the cast and crew are commissioned to turn that screenplay into a cinematic experience. People can still hold on to this distinction if they want, but I see it becoming less meaningful as AI advances.

Indeed, and despite some important differences in our views, I do think we actually have a lot of overlap too.
  4. This is, to reiterate, a capricious position. There is no logical grounding for this assertion. Anthropocentric narcissism is fine if it makes you feel better, but this position does not invalidate how one experiences "art". You can insist that the experience you feel from beauty is not beauty, because that is the arbitrary definition you choose in order to amplify your importance in the universe as a human being. But that experience itself will still be there, and it will be indistinguishable from such an experience that is properly "human sourced". The consequence is that you end up in a topsy-turvy world where that which is ugly you call beautiful because it's "human", and that which is beautiful you call ugly because it's not "human". Which is fine, but I think this view will come to seem increasingly bizarre.
  5. It's fine that you have this opinion, and I don't think anyone is having trouble understanding it. But it's my speculation that this distinction will not have any practical use in the future, at the rate AI is advancing. Say a group produces a movie like Morbius. And say an AI produces a movie like The Shawshank Redemption. The group who produced Morbius may have put their heart and soul into the creation of that movie, but so what? The AI that created The Shawshank Redemption may have done so unaware, but so what? Call Morbius art and The Shawshank Redemption not art if you want, but that does not change the respective impact of these works on the perceiver.

Of course, AI today cannot produce something with the same impact as The Shawshank Redemption.* Humans made that movie. It seems a lot of people here are skeptical that it ever will, because apparently there's some ineffable, unique quality about humans that nothing else can match. We're apparently the chosen ones of the universe that artificial minds can't compare to, and the only way to equal us is through our special route of "aware" intelligence. I will absolutely dispute this. And I think the GPT-4o demos are evidence of the further erosion of our uniqueness. If an unaware AI can so effectively and convincingly emulate a human conversation, then it is reasonable to project that other capabilities will emerge that are convincingly "human". And if that is the case, those stoutly insisting that art is the exclusive purview of the "aware" will do so while the sophistication, beauty, and intelligence of what AI produces eclipses what the critics call "art".

*Just so there isn't a rehash of the same debate that HoI, PoA, and I had, I again point out that this is a subjective assessment anyway.
  6. I think you are misunderstanding what I'm saying. If I say the most delicious item of food is a falafel, I'm not making a logical statement. I'm asserting a subjective judgment. If you disagree and announce that cheese pizza is the most delicious food, you are not making a logical statement either. You are providing a subjective rebuttal. No matter how strong the structure of logic either of us uses in the ensuing discussion of the tastiest food, the underlying premise is based on a subjective stance. It's not logical.
  7. To me, an interpretation is the establishment of an axiom upon which you can then base a system of logical reasoning. You form your interpretation of what art is, and then you use logic to make assertions based on that interpretation. I don't see how the formation of an opinion can itself be considered logical. I understand your opinion, and that's fine.
  8. This is the subjective, non-logical part, and the premise you are operating on. I disagree with this premise. My disagreement is equally subjective. There is no objective interpretation here. Hence, the impasse in this conversation.
  9. I was originally going with "uniqueness", then I decided to go with "specialness", so my comment applies to this post as well. At any rate, even if we disagree to an extent, I think we understand each other. Good discussion!
  10. I think I'm less enamored of the specialness of humans than is typical, so I am more readily inclined to believe that we can be surpassed.
  11. I agree, though the bolded part is something I find myself less skeptical of than you do. I understand you well, I think; I simply did not try to argue the issue with you further. You are insisting on the point that "awareness" is an essential criterion for whether something produced is art. That's fine, but I think the conversation dead-ends here. You are entitled to this personal definition of art. It's a subjective standard, so one cannot use logic to change your position, because it is not a position formed by logic. I will continue to assert that this will become an increasingly impractical definition to hold to in the future, the closer AI comes to producing - in your words - "technically" superior art.
  12. Right, so if AI can create the best show ever, it's art. Once we establish that, what if AI creates a show that is occasionally masterfully written, but with some deep flaws? Let's use Westworld as an example. Still art? What if AI creates a highly divisive work with a niche audience? Let's take the movie The Witch as an example. It has a decent rating on IMDb, but it has its adherents who think it's amazing (myself among them). Would that be art? How about a show virtually everyone thinks is really stupid, but which is very popular? Let's take the Resident Evil movie franchise as an example. Is that art? At what arbitrary point of subjective "quality" are we declaring that "awareness" is the essential differentiator in producing art?
  13. That's what I thought. My assertion is that this position will likely come under a lot of pressure to be reevaluated in the future. If AI starts making shows like The Wire, it won't be particularly useful to declare that they're not art because they weren't made with "awareness".
  14. I understand that, because it is a condition that has been mentioned repeatedly throughout this thread. I am calling this condition capricious. Suppose a group is given a random sample of art that mixes human-generated and AI-generated work, and the group, unaware that AI art is mixed into the sample, does a blind rating of the pieces, and the result is that the AI art is consistently rated as more beautiful, profound, thought-provoking, etc., than the human art. Is this irrelevant? Does the subjective impact of art on those who perceive it have no relevance to its "meaning"? Is meaning unilateral, a function only of the generation of art and not of its perception? And let's note that we are not discussing any objective definition of meaning, whatever your answer. I'm just getting a sense of what your capricious definition of meaning is. (A minimal sketch of how such a blind comparison could be tallied appears after this list.)
  15. I mean, I don't want to devolve the discussion into a discourse on free will here, but obviously "meaning" has no objective meaning. So when critics dispute whether AI-generated art has "meaning", the art fails to qualify only by the critic's capricious definition of the word.
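Since the blind-rating thought experiment in point 14 is doing a lot of work in my argument, here is a minimal sketch of how such a comparison could be tallied. Everything in it is hypothetical: the piece IDs, the stand-in raters, and the 1-10 scale are invented purely for illustration, and real raters would of course be people scoring real works without ever being shown the source.

import random
from statistics import mean

# Hypothetical blind-rating setup: each piece carries a hidden "source" label
# (human or AI) that raters never see. All names and numbers are invented.
pieces = [
    {"id": "piece_1", "source": "human"},
    {"id": "piece_2", "source": "ai"},
    {"id": "piece_3", "source": "human"},
    {"id": "piece_4", "source": "ai"},
]

def collect_ratings(pieces, raters):
    # Show every rater the pieces in a shuffled order (blind: no source label
    # is revealed) and record each rater's score for each piece.
    ratings = {p["id"]: [] for p in pieces}
    for rater in raters:
        for piece in random.sample(pieces, k=len(pieces)):
            ratings[piece["id"]].append(rater(piece["id"]))
    return ratings

def compare_sources(pieces, ratings):
    # Reveal the source labels only after all rating is done, then average
    # the scores per source.
    by_source = {"human": [], "ai": []}
    for p in pieces:
        by_source[p["source"]].extend(ratings[p["id"]])
    return {src: mean(scores) for src, scores in by_source.items()}

# Stand-in raters that score 1-10 at random; real raters would be people.
raters = [lambda piece_id: random.randint(1, 10) for _ in range(5)]
print(compare_sources(pieces, collect_ratings(pieces, raters)))
# e.g. {'human': 5.4, 'ai': 6.1}

The only point of the sketch is that the comparison can be run without anyone knowing which works are AI-made until the very end; whether that outcome would matter to the "awareness" definition is exactly the question I'm asking.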