How do I *really* know that any other humans besides myself are conscious, given that I have no direct insight into their minds?
You don't, and that's an interesting thought for sure, but it's not necessarily related here. All you know for certain is that you yourself exist in some way; everything else you can't know. But let's assume that other people do have inner feelings.
What bugs me about the idea of a program that does this is, eh, difficult to explain. Basically, it's that you're deciding a yes/no question with a gradual process. Say you have software that controls, I dunno, a tank, and it has 3 million lines of code and is hugely complex. This software feels nothing. It doesn't feel "very little", it feels nothing. But an AI that does have feelings, well, it has feelings. So where do those begin? And say you're programming it: it's software, it consists of lines of code. When you start writing, there are no feelings. But at some point, it must be able to feel (obviously). So this comes down to... one additional line of code that you write? A single line of code makes the difference between a program that feels and one that doesn't? What? But it has to, because the feelings must first be there after some specific line. They can't fail to appear after any single line, because everything you ever do is write a lot of single lines, one after another.
Not sure if my thoughts here are understandable; it's difficult to explain them properly. But the idea of deciding a yes/no question like that with a gradual process doesn't make sense to me.