
Artificial intelligence.


loccdor

Subscriber
Joined
Jan 12, 2024
Messages
2,904
Location
USA
Format
Multi Format
I'd like to see the data on click-throughs for videos and news articles that have AI thumbnails vs. those that don't, broken down by demographic groups. For example I'm less likely to watch a video if it has a cheesy AI thumbnail, especially an over-the-top one. However an audience must exist which finds it appealing or eye-catching, or it wouldn't be so common.
 

Alan Edward Klein

Member
Allowing Ads
Joined
Aug 29, 2017
Messages
10,202
Location
New Jersey formerly NYC
Format
Multi Format
AI doesn't have taste. Then again, most people are pretty tacky when it comes down to it. Me being one of them when I'm honest with myself.



I'd say it's closer to CGI in movies. People were floored by it when it started being used, then quickly tired of it when it was overused. Eventually it got so good that even when something was obviously identifiable as CGI, people ignored it. People are still mad at George Lucas for that early 2000s edition of Star Wars where he CGIed over the original special effects.

Also Star Wars isn't for nerds. Star Trek is.

I've got a set from the last year a TV was produced to show 3-D. It's a Sony. I thought it was fun watching 3-D, but like others I found it gets boring, and the intent of the movie is more important. One thing that was special was that private shows were posted on YouTube, such as the guy who rafted down the Grand Canyon, rapids and all. In 3-D that was really terrific.
 

Alan Edward Klein

Member
Allowing Ads
Joined
Aug 29, 2017
Messages
10,202
Location
New Jersey formerly NYC
Format
Multi Format
Nearly everything AI now facilitates in Lightroom or other photo editing tools has its roots in the darkroom, where experienced and skilled printers shaped images to their will through exceptional craft. It was just more time-consuming, so not many bothered to make these changes in the past. Seen that way, today’s tools feel less like a rupture and more like a continuation. So I don’t find it that alarming/dramatic.

AI creates fake images of unreality. It's a programmed graphic representation that looks like a photo image. It's nothing like darkroom editing of negatives that captured reality.
 

BrianShaw

Member
Allowing Ads
Joined
Nov 30, 2005
Messages
17,041
Location
La-la-land
Format
Multi Format
AI creates fake images of unreality. It's a programmed graphic representation that looks like a photo image. It's nothing like darkroom editing of negatives that captured reality.

Has AI ever represented itself otherwise? So I don’t find it that alarming/dramatic
 

MattKing

Moderator
Moderator
Joined
Apr 24, 2005
Messages
55,501
Location
Delta, BC Canada
Format
Medium Format
AI creates fake images of unreality. It's a programmed graphic representation that looks like a photo image. It's nothing like darkroom editing of negatives that captured reality.

Jerry Uelsmann.
View attachment 417822

Perhaps one of the best ever at this sort of darkroom work, but far from the only one.
I manipulate most of my darkroom printing, but usually in far more subtle and less consequential ways.
AI has the same relationship to photography as the stage has to story telling.
 

Alan Edward Klein

Member
Allowing Ads
Joined
Aug 29, 2017
Messages
10,202
Location
New Jersey formerly NYC
Format
Multi Format
Jerry Uelsmann.
View attachment 417822
Perhaps one of the best ever at this sort of darkroom work, but far from the only one.
I manipulate most of my darkroom printing, but usually in far more subtle and less consequential ways.
AI has the same relationship to photography as the stage has to story telling.

The exception that makes the rule.
 

MattKing

Moderator
Moderator
Joined
Apr 24, 2005
Messages
55,501
Location
Delta, BC Canada
Format
Medium Format
The exception that makes the rule.

Thanks for referring to me as exceptional Alan :smile:.
By the way, the saying is "The exception that proves the rule", and in that saying the use of "proves" is a relatively archaic one that we rarely see now - it essentially means "tests", and is most often now encountered in the context of alcohol. We refer to "80 proof rum" when we reference rum that tests out as 40% alcohol by volume.
To a certain extent Alan I think you are greatly influenced by the fact that you historically used slide film and projected it. Those of us who mainly shot slides tended to manipulate our images less, whereas those who did a lot of work printing in darkrooms were more likely to manipulate their images.
I've spent a lot of time in both camps.
I can assure you that, outside of lab environments, manipulation at the time of printing has always been common. That includes newspaper work. Manipulation doesn't necessarily add untruth - it usually adds emphasis or delineation. But it can certainly be used to add mystery or fancifulness.
The creation of montages - assemblies of multiple images into one - has been common for years.
I don't have a lot of that in my history, but there is a little.
I should see if I can find my published work that does that.
I can probably more easily find some of my wedding work where I did that - although that was always done in camera. You know, the bride and groom imaged in a brandy snifter with lighted candle and wedding invitation at the side.
 

Pieter12

Member
Allowing Ads
Joined
Aug 20, 2017
Messages
8,289
Location
Magrathean's computer
Format
Super8
AI creates fake images of unreality. It's a programmed graphic representation that looks like a photo image. It's nothing like darkroom editing of negatives that captured reality.
That is not what it is limited to. It is used, to great benefit, to enhance or modify existing images. Nothing fake beyond what has traditionally been done by retouching--only better.
 

wiltw

Subscriber
Allowing Ads
Joined
Oct 4, 2008
Messages
6,726
Location
SF Bay area
Format
Multi Format
Has AI ever represented itself otherwise? So I don’t find it that alarming/dramatic

Yet folks continue to ask if AI will replace photography with cameras. Editing out undesirable elements using AI is a different matter; so is creating a fictitious image, and so is an AI-controlled robot shooting a real scene.
 

Pieter12

Member
Allowing Ads
Joined
Aug 20, 2017
Messages
8,289
Location
Magrathean's computer
Format
Super8
Yet folks continue to ask if AI will replace photography with cameras. Editing out undesirable elements using AI is a different matter; so is creating a fictitious image, and so is an AI-controlled robot shooting a real scene.

AI might replace certain aspects of commercial photography, such as scenic landscapes and product/catalogue work. J.Crew recently took a lot of flak for using AI-generated images for an ad campaign.
 

MattKing

Moderator
Moderator
Joined
Apr 24, 2005
Messages
55,501
Location
Delta, BC Canada
Format
Medium Format

Cholentpot

Member
Joined
Oct 26, 2015
Messages
7,050
Format
35mm
Everything from J. Crew looked artificial before AI existed, so I'd ask the same question.

Many small glitches in the images, plus a big tell: one of the models' feet is bent backwards underneath him. Search J.Crew AI ads for annotated images.

Right.

So J.Crew used a basic, off-the-shelf, free AI service which cranked out slop. If you pay, and pay attention, then you can fool most people. However, I'm happy they did this and that people noticed.
 

Sean

Admin
Admin
Allowing Ads
Joined
Aug 29, 2002
Messages
13,690
Location
New Zealand
Format
Multi Format
This would have sounded insane a year ago; now it is actually not so far-fetched:

"Coding dies this year. Not evolves. Dies. By December, AI won’t need programming languages. It generates machine code directly. Binary optimized beyond anything human logic could produce. No translation. No compilation. Just pure execution."

An example would be that you no longer code apps; you tell the AI what you want and it renders it. Interactions with the inner and outer workings of the app are seen and processed by the AI, which then renders any required output. No code.
 

koraks

Moderator
Moderator
Joined
Nov 29, 2018
Messages
27,656
Location
Europe
Format
Multi Format
I think there's most likely (it's difficult to verify/validate) a lot of truth to what he says - but there's also a lot of bias that results from the author's limitations in how he views the labor market. To an extent, this is made explicit in his reference to 'white collar workers', which of course explicitly excludes blue-collar workers as well as any professions where physical manipulation plays a major role. Now, arguably, those will be affected dramatically or perhaps even (partially) replaced just the same, but it does require a complementary revolution in the field of robotics - one that has been long in the making and that, despite very similar predictions to this man's dating back to the 1950s and 1960s (!), remains remarkably slow to gain real traction.

This brings me to another shortcoming of the article. While I am more than willing to believe we're currently seeing dramatic (perhaps exponential) improvements in capabilities of AI models, the underlying assumption seems to be that this pace (linear or exponential) is inherently unlimited. A kind of Moore's Law for AI. I wonder if that's justified, and frankly, I really doubt it. I do think that we're seeing an acceleration of the pace, but at some point, the going will likely get tougher than it is right now. This is the same as in any other technological trajectory, although with information products/services, we see an accelerated pace overall compared to most physical products. There's always an S-curve of some sort: a slow start in which we're figuring the basics, then a dramatic increase in speed as we start to "get it", and then it slopes off again once we run into the inherent limitations of the concept. AI tech as we know it today also has those limitations. While we may not have run into them (much) for now (this is inherent to the phase we're in), this doesn't mean they're not there. What nobody knows is when exactly we'll run into that situation of diminishing returns.

Then there's another phenomenon he ignores or perhaps doesn't realize. It's the basic fact that people seek work. It's not just that people pick up the tasks that society somehow drops for them. It's very much the other way around. Employment is seen (and always has been in all societies somewhat similar to ours) as an essential part of human functioning, and as a result, I'm absolutely convinced that we will create jobs if for whatever reason we find there aren't any. History is rife with such projects, arguably, so there's nothing new under the sun in that regard. So while many, perhaps most jobs will change or disappear, new ones will replace the old ones. The lamplighter doesn't exist anymore, and neither does the coachman. And despite the existence of electrical street lights and automobiles, most people do still work.

Whether all work is equally useful, is of course something that can be debated. We all know about the 'bullshit jobs' that nobody (or at least, not those directly involved in them) seems to understand the purpose or utility of. But a sobering reality is that regardless of how bullshitty a job may seem, it at least serves one purpose: to keep a person busy, and create a socially acceptable logic for allocating resources to that person in order for them to participate in the economy. Cf. Parkinson's Law.

I'd also like to point to an associated parallel to AI - that of the computer. It shares many traits of what we presently see in AI. Computers are programmable, and thus they are multifunctional by nature. They can (and do) execute a vast range of tasks in the blink of an eye. The proliferation of personal computing in particular has affected virtually all jobs in society. Yet, jobs remain. Some (many) jobs have disappeared, sure. But new ones emerged.

I do not dispute the fact that AI will be thoroughly transformative to society. I also do not dispute the observation that the change may, to an extent, be revolutionary in terms of scale, scope and speed. These aspects constitute a major shock to society. But I don't think the next generation will find itself free of employment. We're not wired to not work; it's so deeply embedded into our genes and social programming that mass employment just isn't going to be wiped out.

I do think it's likely that we'll see a 'lost generation' that cannot keep up with technology and that finds itself incapable of adequately adjusting to whatever (unpredictably) new demands the future job market will impose on them. Until now, or at least until personal computing rose to prominence, people generally had sufficient time to accommodate to new job requirements because technological progress was gradual enough to allow us that time. That seems to have changed, and arguably, that change already began happening several decades ago. My father, who pioneered office/corporate automation in our country, experienced this first hand in the accountants who just couldn't make the switch from Hollerith machines to programmed multipurpose computing because the new way of working just did not mesh with their mental model of how a tabulating machine worked. It's a bit like trying to understand an automobile by remaining stuck at questions like "where do the oats go in, why are the legs circular and how come the animal doesn't have a head?"

Overall, I think the piece is relevant and insightful in places, but due to the fundamental shortcomings above, the practical implications should be taken with a grain of salt. For instance, he remarks w.r.t. what to tell your kids: "the people most likely to thrive are the ones who are deeply curious, adaptable, and effective at using AI to do things they actually care about. Teach your kids to be builders and learners, not to optimize for a career path that might not exist by the time they graduate." If I remove the word group "at using AI", the remainder was as true 30 years ago as it is today. The notion that the ability to learn (which essentially defines our concept of intelligence) is a differentiator regarding societal success was as true in the days the Egyptians built the pyramids as it is in the days of Claude and ChatGPT.

Conversely, he raves about how nice it is that we can all learn new stuff today, because knowledge is after all free now. Well, the question is whether it is any more or less free today than it was a century ago, and what 'knowledge' is in the first place. More importantly, while ChatGPT may give me a quick answer to even very complex questions, my own ability to make sense of that answer does not necessarily change. Put a very bright teacher next to a not-so-bright kid, and the kid won't become any brighter than they already are. They will learn to the best of their abilities, so there's an edge. But it's not as fundamental a gain as the author seems to contend it is. The smarter guy or gal has the edge, and technology doesn't change that principle in a fundamental way. How the smartness manifests is of course context-specific. But that's something we can't derive from this blog post any more than we can predict other aspects of the future.
Coding dies this year.
Mental arithmetic died a long time ago. Calculus, algebra, mathematics and statistics haven't. Coding changes fundamentally. The advent of higher-level languages changed coding dramatically. Who writes assembler today? I do a decent bit of register-level programming (these days with the help of AI of course, which makes it a whole lot faster and less painful) - strictly speaking, it's not necessary. I still do it. Maybe I'm crazy? Maybe....we all are?
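For anyone curious what "register-level programming" means in practice, here's a minimal C sketch; the peripheral base address, register offset and pin number below are invented purely for illustration and don't correspond to any real chip:

```c
#include <stdint.h>

/* Hypothetical memory-mapped GPIO peripheral. Base address, register
   offset and pin number are made up for illustration only. */
#define GPIO_BASE   0x40020000u
#define GPIO_OUT    (*(volatile uint32_t *)(GPIO_BASE + 0x14u))  /* output data register */
#define LED_PIN     (1u << 5)                                     /* assume an LED on pin 5 */

static void led_on(void)  { GPIO_OUT |=  LED_PIN; }  /* set bit 5: drive the pin high */
static void led_off(void) { GPIO_OUT &= ~LED_PIN; }  /* clear bit 5: drive the pin low */
```

A higher-level language or vendor library hides exactly this sort of bit manipulation behind a function call - which is the kind of shift rising abstraction (AI-assisted or not) has always delivered, without coding itself dying.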
 

warden

Member
Allowing Ads
Joined
Jul 21, 2009
Messages
3,244
Location
Philadelphia
Format
Medium Format
I think there's most likely (it's difficult to verify/validate) a lot of truth to what he says - but there's also a lot of bias that results from the author's limitations in how he views the labor market. […]

It’s good to be having these conversations with younger people now. I have two sons in college and we are having conversations about artificial intelligence with some regularity. One of them is preparing for a career that Shumer thinks will be immediately impacted (negatively) by AI and the other is preparing for a career that Shumer thinks is relatively safe for now. Either way I think the key is flexibility and being willing to adjust as necessary.

…And of course learning how to use AI beyond asking it sophomoric questions that any search engine could answer for you.
 

wiltw

Subscriber
Allowing Ads
Joined
Oct 4, 2008
Messages
6,726
Location
SF Bay area
Format
Multi Format
We think of the negative impacts of AI, but there are also positive results. I only recently learned about two uses of AI that are benefiting mankind:
  • AI analyzes chemical compounds and suggests (much more quickly) similar compounds that should be investigated to resolve medical issues, faster than human scientists can come up with potential solutions.
  • AI analyzes existing chemical compounds with one medical use and suggests alternate medical uses.
I am well aware of some of the criminal uses of AI, and my wife has a good friend who was victimized by what AI can do, losing thousands of dollars to fraud made possible by AI. But there are benefits from AI, too.
 

MattKing

Moderator
Moderator
Joined
Apr 24, 2005
Messages
55,501
Location
Delta, BC Canada
Format
Medium Format

Manaloge

Member
Allowing Ads
Joined
Feb 9, 2026
Messages
3
Location
Belgium
Format
35mm RF
Like every new technology it has its advantages and disadvantages.

AI helped me a great deal in my photo developing. You quickly get an answer about what you did wrong. No offense, but in places like this you'll get 100 people giving different answers and in the end there's no clear solution.

See ChatGPT as a fast search tool to start with, so you can search for the deeper stuff more efficiently.

As in every profession facing new technology, people will always pay more for the real thing, the real artists. The superficial (news) articles that only survive a day will be made by AI. But for higher-end photography, there will always be demand for humans.
 

Cholentpot

Member
Joined
Oct 26, 2015
Messages
7,050
Format
35mm
I just heard the first interview saying AI is old hat and that we should be on the lookout for AGI in the next year or two.
 

Alan Edward Klein

Member
Allowing Ads
Joined
Aug 29, 2017
Messages
10,202
Location
New Jersey formerly NYC
Format
Multi Format
Thanks for referring to me as exceptional Alan :smile:.
By the way, the saying is "The exception that proves the rule" […]

I used "makes the rule" instead of "proves the rule" to avoid being too dogmatic. :smile:

The NY Times has a rule about Photoshopping: regular adjustments for exposure and contrast, and cropping within limits, are acceptable, but cloning things in or out is not. It doesn't have to do with slides or digital capture, although I admit that shooting slides and getting film developed and printed outside has made me personally more of a traditionalist.

What wedding photographers do is between them and their clients and doesn't apply to the point I'm making. Pretty soon, I can see AI making a whole wedding album just from snaps of the bride and groom taken before the wedding. The photographer won't even have to go to the affair.
 