One area that will require further development for linked data to truly take hold in the library community and beyond is automating the creation of links between locally created data sets and large linked data sets such as VIAF, LCNAF, and DBpedia. These links are critical for realizing the full potential of linked data, but they are currently difficult to create accurately and efficiently. Although the NCSU project team had success with staff members manually searching for URIs, that approach would not scale to larger projects, which could involve tens of thousands of URIs. As linked data sets grow, there will be an increasing need for more sophisticated linked data reconciliation tools or services, similar to the authority control processing currently offered by vendors such as Marcive, Backstage, LSSI, and LTI: libraries could submit RDF data and receive back an enhanced data set containing links to equivalent entities in other widely used linked data sets.
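To illustrate what one step of such automated reconciliation might look like, the sketch below queries VIAF's public AutoSuggest endpoint for each local name heading and keeps the URI of the top-ranked candidate. This is a minimal sketch, not a description of the NCSU workflow or of any vendor service: the endpoint URL, the JSON keys ("result", "viafid"), and the sample headings are assumptions made here for demonstration, and a production tool would need to score candidates and handle ambiguous or missing matches rather than accepting the first suggestion.

```python
import json
import urllib.parse
import urllib.request

# VIAF's public AutoSuggest endpoint (assumed here for illustration); it
# returns JSON candidate matches for a name heading, each with a VIAF id.
VIAF_AUTOSUGGEST = "https://viaf.org/viaf/AutoSuggest?query={}"


def suggest_viaf_uri(heading):
    """Return the URI of the top VIAF candidate for a name heading, or None."""
    url = VIAF_AUTOSUGGEST.format(urllib.parse.quote(heading))
    with urllib.request.urlopen(url, timeout=10) as response:
        data = json.load(response)
    candidates = data.get("result") or []
    if not candidates:
        return None
    # Take the highest-ranked suggestion; a real reconciliation tool would
    # compare name forms, dates, and record types before accepting a match.
    viaf_id = candidates[0].get("viafid")
    return "http://viaf.org/viaf/{}".format(viaf_id) if viaf_id else None


if __name__ == "__main__":
    # Hypothetical local headings awaiting reconciliation.
    for heading in ["Austen, Jane, 1775-1817", "Morrison, Toni"]:
        print(heading, "->", suggest_viaf_uri(heading))
```

The same pattern could, in principle, be extended to other targets such as LCNAF or DBpedia by swapping in their lookup services, which is essentially the kind of batch matching a reconciliation vendor service would perform at scale.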