What are the ethical challenges related to whole brain emulation?

Unless there were a way to cryptographically ensure otherwise, whoever ran such an emulation would have full control over its environment and could reset it to any state it had previously been in. This opens up the possibility of powerful interrogation and torture of digital people.

Imperfect uploading might cause damage that makes the em suffer while leaving it useful enough to be run anyway, for example as a test subject for research. We would also have a greater ability to modify digital brains than biological ones, and edits made for research or economic purposes might cause suffering. See this fictional piece for an exploration of what a world with widespread em suffering might look like.

These problems are exacerbated by the likely outcome that digital people can be run much faster than biological humans. As a result, an em could undergo hundreds of subjective years of experience in minutes or hours, without any checks on its wellbeing.

The safety problems related to whole brain emulation arise both during the uploading process and after the mind has been uploaded.

During uploading, it matters whether the technology truly transfers everything that makes up a person's mind, since there is a difference between a mere copy of a mind and a genuinely identical mind. One risk is creating a philosophical zombie: an emulation that acts like the person who was uploaded while not being identical to them in all respects. Whether or not an emulation is a philosophical zombie, there are open questions about its legal personhood and about its relationship to the original person and their relatives. These can create conflicts of interest, for example over whether the emulation could decide to end life support for the original person if they became terminally ill.

After uploading, computer viruses or malware might be able to alter or erase brain emulations, or force them into experiments; such attacks could also be used to hold emulations for ransom.