
BING CHAT has too many limitations and ethics #65

Open
IntelligenzaArtificiale opened this issue May 4, 2023 · 1 comment
Labels
enhancement New feature or request

Comments

@IntelligenzaArtificiale
Owner

We have recently implemented the BING CHAT API.

The problem is that, unlike ChatGPT, it often refuses to perform tasks or to engage in role-playing.

With AUTOGPT.PY it sometimes works very well, almost scarily so; other times it refuses to perform tasks (such as writing articles or other tasks that it deems not suitable for it).

With BabyAGI and Camel, on the other hand, it refuses to perform the tasks, because they are based on role-playing PromptTemplate prompts, and Bing Chat is set up to refuse these role-play tasks.
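
For reference, here is a minimal sketch of the difference, assuming nothing about the actual templates (both strings below are placeholders I made up, not the real BabyAGI/Camel prompts): a role-play style instruction versus a plain task request that Bing Chat seems more willing to accept.

```python
# Illustrative sketch only: these are NOT the real BabyAGI/Camel templates,
# just the two prompt styles being compared.
ROLE_PLAY_TEMPLATE = (
    "Never forget you are a {assistant_role} and I am a {user_role}. "
    "You must complete the task: {task}"
)
PLAIN_TASK_TEMPLATE = (
    "I am working on this task and would appreciate your help: {task}. "
    "Could you suggest the next concrete step?"
)

task = "write a short article about renewable energy"
# Reportedly, Bing Chat often refuses the first style but accepts the second.
print(ROLE_PLAY_TEMPLATE.format(assistant_role="Professional Writer",
                                user_role="Blog Owner", task=task))
print(PLAIN_TASK_TEMPLATE.format(task=task))
```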

Any suggestions to bypass the problem?

@IntelligenzaArtificiale added the enhancement (New feature or request) label on May 4, 2023
@mithras666

Hmm, I've heard that Bing reacts negatively to messages with an accusatory/imperative tone (using "you must" or "do this").

I wonder if we could add some sort of randomized jailbreak prompt to the default Bing prompts.

Be careful with that, though: Microsoft has been banning people for using the same jailbreak prompt many times, so if you include one in your project, it might get many people banned at the same time!
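
As a rough sketch of how the wording could be both softened and randomized without shipping a fixed jailbreak string (every name and prefix below is hypothetical and not part of this repo), the task prompt could be wrapped with a randomly chosen, request-style prefix:

```python
import random

# Hypothetical request-style prefixes: polite paraphrases, not jailbreak prompts.
PREFIX_VARIANTS = [
    "I would really appreciate your help with the following task: ",
    "When you have a moment, could you please look at this task: ",
    "Here is something I am trying to do; any help is welcome: ",
]

def wrap_for_bing(task_prompt: str) -> str:
    """Prepend a randomly chosen, non-imperative prefix so that
    repeated requests do not all share the exact same wording."""
    return random.choice(PREFIX_VARIANTS) + task_prompt

print(wrap_for_bing("summarize the key points of this article"))
```

This only varies the surface wording; whether it actually avoids refusals (or bans) would need testing.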
