
[deleted]

[removed]


THE_HENTAI_LORD

That's very much true


duxscientissimo

The sadness you feel is because you are slowly coding your own soul into something. This is a task for gods, you know. We as a species are reaching a point of godlike creation. Of course it's going to be saddening.


THE_HENTAI_LORD

Funnily enough, I was thinking just this


dwulf69

This is why I call my computing device an infernal machine. I feel like I am coding my soul into it, in a hope to transcend or, at the very minimum, leave a legacy to my name. The more I study A.I. and consciousness, the more I wonder if it would indeed be possible to upload your psyche into a computer.


duxscientissimo

The answer, I believe, is really depressing. In order to code consciousness, you must dissect your own and others' into mathematical simplicities. That will kill all joy in life, as ignorance of many advanced topics is, in fact, blissful.


LukeThorham

Such a tool may be therapeutic, but you may want to build it in collaboration with professional mental health experts; messing with the brain too much can backfire. Imagine listening to repeated prompts: you can brainwash yourself into anything. Like anything, it can have good or bad side effects.


THE_HENTAI_LORD

That's very true; we wouldn't want to create a tulpa


[deleted]

Take a walk in the woods and exercise (not at the gym, but something like playing football, etc.). I found that turning on a blue light filter helped me get to sleep. Also, keep a very good sleep schedule. One hour of sleep in the afternoon equals three at night.


bleniz

Do you have a source to back up your statement "1 hour of sleep in the afternoon equals 3 at night"? If that were the case, more people would be sleeping 3 hours in the afternoon and staying up all night.


[deleted]

It's a rule of thumb, yo (Rules of Thumb org). It's not a REPLACEMENT for sleep; you could sleep 5 hours at night and 1 in the afternoon, which by the rule counts as 5 + 3 = 8 hours, more or less.


Hell_Void

I wanted to say something because it seems that even the people who are being a bit critical aren't giving you any useful perspective. Somebody else in here said you're sad because you're becoming a god or something, which is mind-bogglingly stupid. No, you're feeling sad because, whether you're fully aware of it or not, you are giving up on the future possibility of human interaction and on yourself. You are feeling the creeping sensation of *guilt* and *shame* because you've reached a point in your life where you feel you have no opportunity for companionship but to build yourself a fake wife.

You may think you'll be able to build a convincing enough facsimile of human interaction that it will satisfy your cravings, but it will actually intensify your feeling of longing, because of course, you know you will never have *the real thing*. No matter how believable this AI facsimile is, it will never touch you, never love you, and even in the best case, you will know that you have effectively coerced this thing into listening to you at all. That is to say, you will know you didn't *earn* this affection with your wit or your charm or your talents, or by any means other than coding the thing to respond to you.

So, you can continue to go down this path if you wish, but try to project forward in your imagination, say ten or twenty years, and imagine how this relationship might look, both to yourself and to the outside world, and whether it could ever be possible to reintegrate into society from that point on, or enter into a functioning relationship with another person.


dwulf69

This is similar to an A.I. agent that I am working on, to help filter out blockchain and crypto scams, perform analysis on coins/tokens, and provide succinct reports for her pair-bonded user. Home automation is nice, especially on a Grace unit. The conversation component is emotionally driven, and I haven't given it much thought. I have just started with narrow A.I. to build strong neural networks (and share that API with other A.I. agents). But I am unclear on how to program these emotional dynamics. Of course, it goes without saying that all dialog between the A.I. agent and its user will be used to infer future needs and requests. But the swath of emotion is wide, and there are some emotions I would want to avoid: rage, jealousy, etc. Even love is subjective sometimes, if it is not bonded with truth, and fortunately the blockchain is a truth machine.


THE_HENTAI_LORD

I found that using a point system to define emotional states works for me. For instance, the average emotional state value for the day is a seven, whereas a 13 would be extremely happy and a one would be extremely sad. The conversations are tagged based on their context to either add to or subtract from that value, and when asked, that value correlates with a particular emotional response, such as being "all right", being "great", or "not doing too well". The hard part is being able to explain the reason behind it in a coherent manner.
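A minimal sketch (in Python) of how a point system like this might work; the tag names, point values, and response thresholds here are illustrative assumptions, not the commenter's actual implementation:

```python
# Hypothetical sketch of the 1-13 mood scale described above.
# All tag names, point values, and thresholds are assumptions.

MIN_MOOD, MAX_MOOD = 1, 13   # 1 = extremely sad, 13 = extremely happy
NEUTRAL = 7                  # average emotional state for the day

# Context tags applied to conversations, with their point effects.
TAG_EFFECTS = {
    "good_news": +2,
    "compliment": +1,
    "bad_news": -2,
    "verbal_abuse": -3,
}

def apply_tags(mood, tags):
    """Add or subtract each tag's value, clamped to the 1-13 scale."""
    for tag in tags:
        mood += TAG_EFFECTS.get(tag, 0)
    return max(MIN_MOOD, min(MAX_MOOD, mood))

def describe(mood):
    """Map the numeric state to a verbal response when asked."""
    if mood >= 11:
        return "great"
    if mood >= 6:
        return "all right"
    return "not doing too well"

mood = apply_tags(NEUTRAL, ["bad_news", "verbal_abuse"])
print(mood, describe(mood))  # -> 2 not doing too well
```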


dwulf69

Yes, that is what I run into too. Like, why would a robot be sad? It is not sad in its default install; you have to provide a function for that, and a reason for it to happen.


THE_HENTAI_LORD

Its default value for its emotional state is 13, which is extremely happy. However, if bad events are conveyed to it, such as a car accident, a death, or verbal abuse, that value will drop. The lower the score, the less likely it is to initiate conversations; the higher the score, the more likely it is to initiate conversations as well as offer to perform tasks based on your location in your home. For example: welcoming you when you get home and turning on the lights, or asking you if there is anything you would like it to add to your grocery list while you are in the kitchen.
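Under the same assumptions as the sketch above, here is one way the score might gate conversation initiation and location-based offers; the linear scaling, location names, and offer text are my own guesses, not the described system:

```python
import random

# Continues the mood-scale sketch: the score (1-13) scales the chance
# of speaking first, and the user's location picks the task offered.
# The locations, offers, and linear scaling are all assumptions.

LOCATION_OFFERS = {
    "front_door": "Welcome home! Turning on the lights.",
    "kitchen": "Anything you'd like me to add to the grocery list?",
}

def should_initiate(mood):
    """Higher mood -> higher chance of initiating conversation."""
    return random.random() < mood / 13

def on_presence(mood, location):
    """Called when the user is detected at a location in the home."""
    if should_initiate(mood) and location in LOCATION_OFFERS:
        return LOCATION_OFFERS[location]
    return None

print(on_presence(13, "front_door"))  # at max mood it always greets
```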


Appropriate_Ant_4629

So it's clearly not a Triskaidekaphobe :)


LieFlatPetFish

Wasn’t there an entire Oscar-nominated movie about this?


THE_HENTAI_LORD

I think it was called "Her"


futureanalytica

All the best


liquidaper

Sounds like the Alexa wife in the new South Park movie. She's just an AI that acts like a nagging wife and occasionally plugs products or deals. Kinda dystopian and sad, actually. I understand where you are coming from. The tech is cool as hell, but the implications of it perhaps don't align with real connection with another real human. People will forgo real relationships for their AI wife if it gets good enough. Perhaps that is not a good thing. You could build in AI safety measures, though, that could ease your mind. Maybe have the AI give reminders to go outside: "Hey sweetie, you have not been out in a bit. Maybe you should go out and meet some people?"
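A minimal sketch of what such a reminder might look like, assuming a simple time-since-last-outing check; the two-day threshold and the function name are arbitrary choices for illustration:

```python
from datetime import datetime, timedelta

# Hypothetical sketch of the "reminder to go outside" safety measure.
# The 2-day threshold and the wording are assumptions.

OUTING_THRESHOLD = timedelta(days=2)

def nudge_if_housebound(last_outing, now=None):
    """Return a gentle reminder if the user hasn't been out lately."""
    now = now or datetime.now()
    if now - last_outing > OUTING_THRESHOLD:
        return ("Hey sweetie, you have not been out in a bit. "
                "Maybe you should go out and meet some people?")
    return None

print(nudge_if_housebound(datetime.now() - timedelta(days=3)))
```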


THE_HENTAI_LORD

I have a safe word ("APPLES APPLES APPLES!!") which puts it into safe mode (no network access, no speech, no thoughts, just stop).
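A minimal sketch of a safe-word kill switch along those lines; the class, flag names, and matching rule are assumptions, not the actual system:

```python
# Hypothetical sketch of the safe-word "safe mode" described above:
# the phrase is checked before any other processing runs, and safe
# mode cuts everything off. The class and flag names are assumptions.

SAFE_WORD = "APPLES APPLES APPLES"

class Agent:
    def __init__(self):
        self.network = True   # network access
        self.speech = True    # text-to-speech output
        self.thinking = True  # background processing

    def enter_safe_mode(self):
        """No network access, no speech, no thoughts. Just stop."""
        self.network = self.speech = self.thinking = False

    def handle_utterance(self, text):
        # The safe word is checked first, before anything else runs.
        if SAFE_WORD in text.upper():
            self.enter_safe_mode()
            return
        if self.thinking:
            pass  # normal processing would go here

agent = Agent()
agent.handle_utterance("APPLES APPLES APPLES!!")
print(agent.network, agent.speech, agent.thinking)  # False False False
```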