Unable to locate an element in Robocorp VS extension

Unable to locate an element in the Robocorp VS Code extension. I have a login page where I got two unique locators, with name and id:

name="email" and id="email". The robot is failing; I have tried all the possible ways to locate the element.

If there are two elements with the same id, then that page is probably broken. Ids should be unique.

But can you tell us what library you are using, and also show your code, so we have an idea of what is going on. Your conda.yaml and robot.yaml could also give us some more understanding.

here's my tasks.robot file:

```robotframework
*** Settings ***
Documentation       Template robot main suite.
Library             RPA.Browser.Selenium    auto_close=${FALSE}
Library             RPA.FTP
Library             RPA.Desktop.Windows

*** Tasks ***
Minimal task
    Log    Done.

Open the website
    Open website
    Login to Application
    # Close Browser

*** Keywords ***
Open website
    Open Available Browser    https://anz.aus-dev.com
    Set Selenium Speed    10seconds
    Find Element    id:email

Login to Application
    Click Element    //*[@name="email"]
    # Input Text When Element Is Visible    name:password    text
    Input Text When Element Is Visible    id:'email'    testing
```

I'm able to launch the browser successfully and navigate to the URL, but I am unable to locate the element.

here's my conda.yaml and robot.yaml files:
```yaml
Run all tasks:
  shell: python -m robot --report NONE --outputdir output --logtitle "Task log" tasks.robot

condaConfigFile: conda.yaml
artifactsDir: output
```


  • .gitignore

Can you please add/edit those files as preformatted text in your message? It is very hard to read now that it is pasted as forum text.

Also, I think your conda.yaml is not there (it seems to be a .gitignore file or something similar).
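For reference, a minimal conda.yaml for a Robocorp robot usually looks something like this. The exact version numbers below are just placeholders from a standard template; adjust them to whatever your robot actually needs:

```yaml
channels:
  - conda-forge

dependencies:
  - python=3.9.13
  - pip=22.1.2
  - pip:
      - rpaframework==22.5.3
```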

And I still ask: do you have some other VS Code "Robot Framework" plugins installed? It seems that something is causing your VS Code to become confused about what it should do with "robot.yaml", for example. If you have multiple plugins installed, that will cause weird behaviour, so I would advise you to disable those first and then see if you can cleanly run the robot using our plugins only.
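As a general debugging tip on the locator problem itself: the page may still be loading when `Find Element` runs, and the quotes in `id:'email'` become part of the locator, so Selenium looks for an element whose id literally contains the quote characters. A hedged sketch of the keyword with the quotes removed and an explicit wait added (assuming the input really has id="email"):

```robotframework
*** Settings ***
Library    RPA.Browser.Selenium

*** Keywords ***
Login to Application
    # Wait for the field to appear before interacting with it
    Wait Until Element Is Visible    id:email    timeout=15s
    # Note: id:email, not id:'email' – the quotes are not part of the id
    Input Text    id:email    testing
```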