AMERICAN EDUCATIONAL RESEARCH ASSOCIATION, INC. et al v. PUBLIC.RESOURCE.ORG, INC.

Filing 60

MOTION for Summary Judgment Filed by AMERICAN EDUCATIONAL RESEARCH ASSOCIATION, INC., AMERICAN PSYCHOLOGICAL ASSOCIATION, INC., NATIONAL COUNCIL ON MEASUREMENT IN EDUCATION, INC. (Attachments: #1 Statement of Facts Points of Authority, #2 Statement of Facts Statement of Undisputed Facts, #3 Declaration Declaration of Jonathan Hudis, #4 Exhibit Ex. A, #5 Exhibit Ex. B, #6 Exhibit Ex. C, #7 Exhibit Ex. D, #8 Exhibit Ex. E, #9 Exhibit Ex. F, #10 Exhibit Ex. G, #11 Exhibit Ex. H, #12 Exhibit Ex. I, #13 Exhibit Ex. J, #14 Exhibit Ex. K, #15 Exhibit Ex. L, #16 Exhibit Ex. M, #17 Exhibit Ex. N, #18 Exhibit Ex. O, #19 Exhibit Ex. P, #20 Exhibit Ex. Q, #21 Exhibit Ex. R, #22 Exhibit Ex. S, #23 Exhibit Ex. T, #24 Exhibit Ex. U, #25 Exhibit Ex. V-1, #26 Exhibit Ex. V-2, #27 Exhibit Ex. W, #28 Exhibit Ex. X, #29 Exhibit Ex. Y, #30 Exhibit Ex. Z, #31 Exhibit Ex. AA, #32 Exhibit Ex. BB, #33 Exhibit Ex. CC, #34 Exhibit Ex. DD, #35 Exhibit Ex. EE, #36 Exhibit Ex. FF-1, #37 Exhibit Ex. FF-2, #38 Exhibit Ex. FF-3, #39 Exhibit Ex. FF-4, #40 Exhibit Ex. FF-5, #41 Exhibit Ex. FF-6, #42 Exhibit Ex. GG, #43 Exhibit Ex. HH, #44 Exhibit Ex. II, #45 Exhibit Ex. JJ, #46 Exhibit Ex. KK, #47 Exhibit Ex. LL, #48 Exhibit Ex. MM, #49 Declaration Declaration of Marianne Ernesto, #50 Exhibit Ex. NN, #51 Exhibit Ex. OO, #52 Exhibit Ex. PP, #53 Exhibit Ex. QQ, #54 Exhibit Ex. RR, #55 Exhibit Ex. SS, #56 Exhibit Ex. TT, #57 Exhibit Ex. UU, #58 Exhibit Ex. VV, #59 Exhibit Ex. WW, #60 Exhibit Ex. XX, #61 Exhibit Ex. YY, #62 Exhibit Ex. ZZ, #63 Exhibit Ex. AAA, #64 Exhibit Ex. BBB, #65 Exhibit Ex. CCC, #66 Exhibit Ex. DDD, #67 Exhibit Ex. EEE, #68 Exhibit Ex. FFF, #69 Exhibit Ex. GGG, #70 Exhibit Ex. HHH, #71 Exhibit Ex. III, #72 Exhibit Ex. JJJ, #73 Declaration Declaration of Lauress Wise, #74 Exhibit Ex. KKK, #75 Exhibit Ex. LLL, #76 Declaration Declaration of Wayne Camara, #77 Exhibit Ex. MMM, #78 Declaration Declaration of Felice Levine, #79 Exhibit Ex. NNN, #80 Exhibit Ex. 
OOO (Public Version), #81 Exhibit Ex. PPP, #82 Exhibit Ex. QQQ, #83 Exhibit Ex. RRR, #84 Exhibit Ex. SSS, #85 Exhibit Ex. TTT-1, #86 Exhibit Ex. TTT-2, #87 Exhibit Ex. UUU, #88 Declaration Declaration of Kurt Geisinger, #89 Declaration Declaration of Dianne Schneider, #90 Text of Proposed Order Proposed Order, #91 Certificate of Service Certificate of Service)(Hudis, Jonathan). Added MOTION for Permanent Injunction on 12/22/2015 (td).

UNITED STATES DISTRICT COURT
FOR THE DISTRICT OF COLUMBIA

AMERICAN EDUCATIONAL RESEARCH ASSOCIATION, INC., AMERICAN PSYCHOLOGICAL ASSOCIATION, INC., and NATIONAL COUNCIL ON MEASUREMENT IN EDUCATION, INC.,

Plaintiffs,

v.

PUBLIC.RESOURCE.ORG, INC.,

Defendant.

Civil Action No. 1:14-cv-00857-TSC-DAR

DECLARATION OF WAYNE CAMARA IN SUPPORT OF PLAINTIFFS' MOTION FOR SUMMARY JUDGMENT AND ENTRY OF A PERMANENT INJUNCTION

I, WAYNE J. CAMARA, declare:

1. I am the Senior Vice President, Research at ACT. My company produces and publishes the ACT® college readiness assessment, a college admissions and placement test taken by millions of high school students every year. ACT also offers comprehensive assessment, research, information, and program management services to support education and workforce development. As the Senior Vice President of Research, I am responsible for all research and evidence related to the design, development, use, and validation of our assessments and programs. In my position, I serve on the Senior Leadership Team and manage over 110 researchers.

2. I submit this Declaration in support of the motion of the American Educational Research Association, Inc. ("AERA"), the American Psychological Association, Inc. ("APA"), and the National Council on Measurement in Education, Inc. ("NCME") (collectively, "Plaintiffs" or "Sponsoring Organizations") for summary judgment and the entry of a permanent injunction.

3. Prior to working at ACT, I worked at The College Board, where I held the positions of Vice President, Research and Development (July, 2000 – September, 2013), Executive Director, Office of Research and Development (March, 1997 – June, 2000), and Research Scientist (September, 1994 – February, 1997).

4. Before working at The College Board, I worked for APA in the positions of Assistant Executive Director for Scientific Affairs and Executive Director of Science (1992–1994), Director, Scientific Affairs (February, 1989 – August, 1992), and Testing and Assessment Officer (November, 1987 – January, 1989). During my time at APA, I also served as the Project Director for the revision of the 1985 edition of the Standards for Educational and Psychological Testing published in 1999 (the "1999 Standards"). In 1997, I was elected to APA's Council of Representatives, and I served on the Council from 1997-2003. In April, 2012, I was elected to the AERA Council, serving from April, 2012 to April, 2015 as Vice President for Division D. I was also elected to NCME's Board of Directors, serving on the Board from 2002-2005 and 2009-2012, and served as NCME's President from 2010-2011. Additionally, I served on the Management Committee for the Standards from 2005-2015.

5. My curriculum vitae is attached to this Declaration as Exhibit 1.

6. I have written extensively on the Standards, as well as on other professional and technical guidelines relating to educational and industrial testing and assessment, including journal articles, book chapters, and paper presentations at national conferences.

7. In 1954, APA prepared and published the "Technical Recommendations for Psychological Tests and Diagnostic Techniques." In 1955, AERA and NCME prepared and published a companion document entitled "Technical Recommendations for Achievement Tests." Subsequently, a joint committee of the three organizations modified, revised, and consolidated the two documents into the first Joint Standards. Beginning with the 1966 revision, the Sponsoring Organizations collaborated in developing the "Joint Standards" (or simply, the "Standards"). Each subsequent revision of the Standards has been careful to note that it is a revision and update of the prior version.

8.
Beginning in the mid-1950s, the Sponsoring Organizations formed and periodically reconstituted a committee of highly trained and experienced experts in psychological and educational assessment, charged with the initial development of the Technical Recommendations and then with each subsequent revision of the (renamed) Standards. These committees were formed by the Sponsoring Organizations' Presidents (or their designees), who would meet and jointly agree on the membership. Often a chair or co-chairs of these committees were selected by joint agreement. Beginning with the 1966 version of the Standards, this committee came to be referred to as the "Joint Committee."

9. Financial and operational oversight for the Standards' revisions, promotion, and distribution, and for the sale of the 1999 and 2014 Standards, has been undertaken by a periodically reconstituted Management Committee, comprised of designees of the three Sponsoring Organizations. As noted above, I served on this Management Committee from 2005-2015.

10. All members of the Joint Committee(s) and the Management Committee(s) are unpaid volunteers. The expenses associated with the ongoing development and publication of the Standards include travel and lodging expenses (for the Joint Committee and Management Committee members), support staff time, printing and shipment of bound volumes, and advertising costs.

11. From the time of their initial creation to the present, the preparation of and periodic revisions to the Standards have entailed intensive labor and considerable cross-disciplinary expertise. Each time the Standards are revised, the Sponsoring Organizations select and arrange for meetings of the leading authorities in psychological and educational assessment (known as the Joint Committee). During these meetings, certain Standards are combined, pared down, and/or augmented; others are deleted altogether; and some are created as entirely new individual Standards. The 1999 version of the Standards is nearly 200 pages, took more than five years to complete, and is the result of work put in by the Joint Committee to generate a set of best practices on educational and psychological testing that are respected and relied upon by leaders in their fields.

12. Draft revisions of the 1985 Standards, for what became the 1999 Standards, were widely distributed for public review and comment during the revision process. The Joint Committee received thousands of pages of comments and proposed text revisions from the membership of the Sponsoring Organizations; scientific, professional, trade, and advocacy groups; credentialing boards; state and federal government agencies; test publishers and developers; and academic institutions. While the Joint Committee reviewed and took under advisement these helpful comments, the final language of the 1999 Standards was a product of the Joint Committee members. When the 1985 Standards were revised, more than half the content of the 1999 Standards resulted from newly written prose of the Joint Committee.

13. The Standards originally were created as principles and guidelines – a set of best practices to improve professional practice in testing and assessment across multiple settings, including education and various areas of psychology. The Standards can and should be used as a recommended course of action in the sound and ethical development and use of tests, and also to evaluate the quality of tests and testing practices. Additionally, an essential component of responsible professional practice is maintaining technical competence. Many professional associations also have developed standards and principles of technical practice in assessment. The Sponsoring Organizations' Standards have been and still are used for this purpose.

14. The Standards, however, are not intended simply for members of the Sponsoring Organizations, AERA, APA, and NCME.
The intended audience of the Standards is broad and cuts across audiences with varying backgrounds and different training. For example, the Standards also are intended to guide test developers, sponsors, publishers, and users by providing criteria for the evaluation of tests, testing practices, and the effects of test use. Test user standards are those standards that help test users decide how to choose certain tests, interpret scores, or make decisions based on test results. Test users include clinical or industrial psychologists, research directors, school psychologists, counselors, employment supervisors, teachers, and various administrators who select or interpret tests for their organizations. There is no mechanism, however, to enforce compliance with the Standards on the part of the test developer or test user. The Standards, moreover, do not attempt to provide psychometric answers to policy or legal questions.

15. The Standards promote the development of high quality tests and the sound use of results from such tests. Without such high quality standards, tests might produce scores that are not defensible or accurate, not an adequate reflection of the characteristic they were intended to measure, and not fair to the person tested. Consequently, decisions about individuals made with such test scores would be no better, or even worse, than those made with no test score information at all. Thus, the Standards help to ensure that measures of student achievement are relevant, that admissions decisions are fair, that employment hiring and professional credentialing result in qualified individuals being selected, and that patients with psychological needs are diagnosed properly and treated accordingly. Quality tests protect the public from harmful decision making and provide opportunities for education and employment that are fair to all who seek them.

16. The Standards apply broadly to a wide range of standardized instruments and procedures that sample an individual's behavior, including tests, assessments, inventories, scales, and other testing vehicles. The Standards apply equally to standardized multiple-choice tests, performance assessments (including tests comprised only of open-ended essays), and hands-on assessments or simulations. The main exceptions are that the Standards do not apply to unstandardized questionnaires (e.g., unstructured behavioral checklists or observational forms), teacher-made tests, and subjective decision processes (e.g., a teacher's evaluation of students' classroom participation over the course of a semester).

17. The Standards have been used as a source in developing testing guidelines for such activities as college admissions, personnel selection, test translations, test user qualifications, and computer-based testing. The Standards also have been widely cited to address technical, professional, and operational norms for all forms of assessments that are professionally developed and used in a variety of settings. The Standards additionally provide a valuable public service to state and federal governments as they voluntarily choose to use them. For instance, each testing company, when submitting proposals for test administration, instead of relying on a patchwork of local, or even individual and proprietary, testing design and implementation criteria, may rely instead on the Sponsoring Organizations' Standards to afford the best guidance for testing and assessment practices.

18. The Standards were not created or updated to serve as a legally binding document, in response to an expressed governmental or regulatory need, or in response to any legislative action or judicial decision. However, the Standards have been cited in judicial decisions related to the proper use of and evidence for assessment, as well as by state and federal legislators. These citations in judicial decisions and during legislative deliberations occurred without any lobbying by the Plaintiffs.

19. The Sponsoring Organizations do not keep any of the revenues generated from sales of the Standards. Rather, the income from these sales is used by the Sponsoring Organizations to offset their development and production costs and to generate funds for subsequent revisions. This allows the Sponsoring Organizations to develop up-to-date, high quality Standards that otherwise would not be developed due to the time and effort that goes into producing them.

20. At one time, funding for the Standards revision process from third party sources (e.g., governmental agencies, foundations, other associations interested in testing and assessment issues, etc.) was raised as a consideration. However, this option was not seriously explored, as the potential conflicts of interest in doing so led the Sponsoring Organizations to conclude that the Standards revisions should be self-funding – that is, funded from the sale of prior editions of the Standards.

21. In late 2013 and early 2014, the Sponsoring Organizations became aware that the 1999 Standards had been posted on the Internet without their authorization, and that psychology students were obtaining free copies from the posting source. Upon further investigation, the Sponsoring Organizations discovered that Public.Resource.Org, Inc. ("Public Resource") was the source of the online posting.
Accompanying this Declaration as Exhibit MMM is a true copy of a thread of emails exchanged among Laurie Wise, Suzanne Lane, David Frisbie, Jerry Sroufe, Marianne Ernesto, Barbara Plake, and myself,¹ sent between December 16, 2013 and February 4, 2014, discussing Public Resource's posting of the 1999 Standards on the Internet, and marked as Exhibit 1185 during my deposition.

22. Past harm to the Sponsoring Organizations from Public Resource's activities includes a lack of greater funding that otherwise would have been available for the update of the Sponsoring Organizations' Standards from the 1999 to the 2014 versions, due to the reduced volume of sales of the 1999 Standards.

23. Should Public Resource's infringement be allowed to continue, the harm to the Sponsoring Organizations, and to the public at large who rely on the preparation and administration of valid, fair, and reliable tests, includes: (i) uncontrolled publication of the 1999 Standards without any notice that those guidelines have been replaced by the 2014 Standards; (ii) future unquantifiable loss of revenue from sales of authorized copies of the 1999 Standards (with proper notice that they are no longer the current version) and the 2014 Standards; and (iii) lack of funding for future revisions of the 2014 Standards and beyond.

24. Due to the small membership size of NCME, and the relatively minor portion of the membership of AERA and APA who devote their careers to testing and assessment, it is highly unlikely that the members of the Sponsoring Organizations would vote for a dues increase to fund future Standards revision efforts if Public Resource successfully defends this case and is allowed to post the Standards online for the public to download or print for free. As a result, the Sponsoring Organizations would likely abandon their practice of periodically updating the Standards, and there would be an absence of any authoritative and independent source of sound guidance relating to the development, use, and evaluation of psychological and educational tests.

¹ Laurie Wise is the Immediate Past President of NCME and was serving as President of NCME at the time of the email; Suzanne Lane is a member of the Standards Management Committee; David Frisbie also is a member of the Standards Management Committee; Jerry Sroufe is the Director of Government Relations at AERA; Marianne Ernesto is the Director, Testing and Assessment, at APA; and Barbara Plake was Laurie Wise's co-chair of the Joint Committee for the revision of the 1999 Standards, which ultimately were published in 2014.

Dated: December 8, 2015

Wayne J. Camara

EXHIBIT 1

WAYNE J. CAMARA

OFFICE: ACT, 500 ACT Drive, Iowa City, IA 52243-0168; Tel (319) 337-1869; wayne.camara@act.org
HOME: 81 Lewis St., Marion, MA 02738

EDUCATION:
Ph.D. 1986 University of Illinois at Urbana, Psychology: Educational Measurement; Cognate: Industrial and Organizational Psychology
C.A.G.S. 1982 Rhode Island College (School Psychology), Providence, R.I.
M.A. 1980 Rhode Island College (Educational Measurement), Providence, R.I.
B.A. 1978 University of Massachusetts (Psychology/Education), N. Dartmouth, MA

PROFESSIONAL EXPERIENCE:

ACT, Iowa City, IA
Senior Vice President, Research (September 2013 – )
Oversees research departments across education and workforce assessments and services related to research, psychometrics, data reporting, statistical analysis, policy research, survey development, and industrial psychology services (e.g., job profiling). Manages a staff of over 125 professionals. Serves on ACT's strategic leadership team and is responsible for shaping and guiding organizational direction and planning, as well as representing the organization with external audiences and stakeholders in areas including accountability, research, admissions testing, etc. Member of the Executive Leadership Team and business sponsor on a range of technology and development projects.
The College Board, New York, NY
Vice President, Research and Development (July, 2000 – September, 2013)
Executive Director, Office of Research and Development (March, 1997 – June, 2000)
Research Scientist (Sept., 1994 – Feb., 1997)
Was responsible for all research, standards and alignment services, and psychometric and assessment development activities at the College Board, including design and implementation of R&D activities supporting College Board assessment programs (SAT, PSAT/NMSQT, AP, CLEP, Subject Tests, Accuplacer, SpringBoard, etc.). Managed a staff of approximately 75 professionals across several units and locations: Research, Statistics and Psychometrics, Test Development, Analysis and Reporting, and Standards Alignment. Responsible for policy research and outreach with state assessment directors, higher educational institutions, state Boards of Education, and other policy and governance bodies. Coordinated product planning and business planning for new assessments and enhancements to current assessments. Responsible for several external advisory panels and test development committees. Responsible for reporting SAT aggregate results to institutions, reviewing all items and final forms of the SAT, and other operational work related to assessment development and delivery. Served as a spokesperson for the College Board on technical and assessment policy discussions with the media, policymakers (e.g., testimony), institutions, and other key stakeholders. Worked with states, districts, policy makers, and higher educational systems to provide data, analyses, and information concerning student achievement and college readiness. Directed data release process, guidelines, and approvals. Project manager for development of the New SAT; represented the College Board on issues of test development and research with universities, higher educational associations, states and districts, academic associations, and other groups. Responsible for hiring and management of vendors and academicians to implement research, review test forms and items, and develop prototypes. Specific areas of research included validity of admissions measures, evaluation of educational programs (including effects of accommodations and extended time for examinees with disabilities), meta-analysis of SAT validity, grade inflation trends, and development of new constructs and measures relevant to an expanded predictor-criterion space. Selected development efforts: (1) Conceived, developed, and conducted research resulting in AP Potential, which increased access to AP courses by identifying students with potential for success; (2) 2005 SAT redesign with writing; (3) Implementation of ECD and AP Redesign work in selected courses; (4) Research and transition plan to move AP, SAT, and PSAT from formula scoring to rights-only scoring; (5) Design of AP Portfolio and through-course pilot; (6) Accuplacer diagnostic tests and replatforming; (7) Plan to migrate most research and selected psychometric operational work to the CB from vendors; and (8) Design of CLEP testlet assessment.

AMERICAN PSYCHOLOGICAL ASSOCIATION, Washington, D.C.
Assistant Executive Director for Scientific Affairs and Executive Director of Science (1992–1994)
Director, Scientific Affairs (February, 1989 – Aug., 1992)
Testing and Assessment Officer (Dec., 1987 – Jan., 1989)
Project Director for the Revision of the Standards for Educational and Psychological Testing and Assessment. Managed the technical committee, various technical panels, a financial management committee, and an executive committee comprised of the Presidents of APA, AERA, and NCME. Coordinated and developed all association policies and guidelines in areas of scientific affairs, scientific misconduct, research funding, and testing and assessment.
Major areas of responsibility in measurement and assessment included: (a) monitoring scientific and technical advances; (b) educating policy makers, the public, the media, and other professionals (e.g., employers, educators) about the relevance and appropriate applications of assessment; (c) developing technical guidance and policy statements that address new and emerging areas, reflecting both the scientific and professional consensus in assessment; (d) working collaboratively with other professional associations, advocacy groups, and governmental agencies; and (e) testimony and advocacy on the efficacy of behavioral science. Directed APA involvement in numerous assessment issues at the national level: SCANS, Americans with Disabilities Act, national education standards, industry-based skills standards, Civil Rights Act of 1991, efficacy of clinical assessment, integrity testing, and test-based accountability initiatives. Assisted in developing amicus briefs for the Supreme Court, informing policymakers, the media, and the public of technical advances in assessment (e.g., validation strategies, computer-based and interactive assessments, implications of fairness and utility analyses, etc.) and behavioral science research more broadly.

GEORGE WASHINGTON UNIVERSITY, Washington, DC
Adjunct Professor of Administrative Sciences and College of Business (1988–1994)
Taught graduate seminars in training, performance evaluation, personnel selection, and organizational behavior. Served on several doctoral dissertation committees in I-O psychology.

HUMAN RESOURCES RESEARCH ORGANIZATION, Alexandria, VA
Senior Scientist (Feb., 1987 – November, 1987), Research Scientist (August, 1985 – Feb., 1987)
Conducted research and managed grants and proposal development in areas of job analysis, competency modeling, military testing, training, and personnel selection. Projects included: Investigated the utility of algorithms used in computer-based job classification systems employed by each branch of the military service. Developed a crosswalk between military occupations in each service branch and civilian occupations. Served as Project Director and Principal Investigator for contracts funded by the Assistant Secretary of Defense and the Navy Personnel Research and Development Center to conduct a longitudinal evaluation of the impact of military training and service on the subsequent employment and social success of low-aptitude youth enlisted in the military. Developed a training and career development system for first-line civilian supervisors in the U.S. Army. Provided recommendations for the career development and training of future and incumbent Army civilian first-line supervisors. Developed training evaluation instruments and conducted evaluation of counselor training in the use and interpretation of the ASVAB. Managed and conducted several job analysis projects for military and civilian occupations with the Department of Defense.

PERSONNEL SERVICES OFFICE, UNIVERSITY OF ILLINOIS, Champaign, IL
Human Resources Consultant (1983–85), Illinois State Civil Service
Designed and managed R&D projects, including the development of a computerized adaptive screening measure to optimize the matching of jobs and applicants for Civil Service positions. Conducted a large-scale job analysis of 70 professional and technical job classifications. Used multiple-rater, multiple-method job analyses and applied generalizability theory to interpret findings. Performed validation studies of existing civil service exams.

DEPARTMENT OF PSYCHOLOGY, UNIVERSITY OF ILLINOIS AT URBANA-CHAMPAIGN
Teaching Assistant and Academic Advisor (1983–85). Research Assistant (1984–85).

COLLEGE OF EDUCATION, UNIVERSITY OF ILLINOIS
Psychological Evaluator (1983–84). Administered, scored, and interpreted a variety of psychological and cognitive measures.
BRISTOL COMMUNITY COLLEGE, Fall River, MA
Lecturer (Spring 1980 – 1982), Psychology and Education

WEST BRIDGEWATER PUBLIC SCHOOLS, West Bridgewater, MA
School Psychologist (1979–82)
Chairperson on team evaluations and reviews. Representative for the school district in out-of-district placements, conferences, and regional and state planning meetings. Psychological testing – psychodiagnostic and learning assessment – individual IQ tests, projective testing, and special abilities testing, including two years of clinical supervision. Developed detailed assessment and remediation plans for over 150 students.

PROFESSIONAL AFFILIATIONS:
American Psychological Association – Elected Fellow, 1994
  Division of Educational Psychology
  Division of Evaluation, Measurement, and Statistics – Elected Fellow, 2002
  Division of General Psychology – Elected Fellow, 1994
  Division of Military Psychology
  Division of International Psychology – Elected Fellow, 2009
American Psychological Society – Elected Fellow, 2007
American Educational Research Association – Elected Fellow, 2008
International Association of Applied Psychology
International Test Commission, 1997
National Association of Collegiate Admissions Counselors
National Council on Measurement in Education
New York Academy of Sciences – Elected, 2002
Personnel Testing Council of Metropolitan Washington
Society for Industrial/Organizational Psychology – Elected Fellow, 1999

RELATED PROFESSIONAL ACTIVITIES AND AWARDS:
American Educational Research Association, Division D (Measurement and Research Methodology): Co-Chair, Annual Convention Program, 1998.
American Psychological Association: Appointed APA Member of the Joint Committee on Testing Practices, 1997-2000; Elected to Council of Representatives, 1997-2000; Member of Joint Science and Practice Integration Task Force, 1998; Member, CODAPAR, 2004-2007.
Division of Evaluation, Measurement and Statistics: Member-at-Large, 1997-2000; Chair, Professional Affairs, 1995-96; Member, Program Committee, 1992, 1994.
Division of Military Psychology: Chair, Program Committee, 1988.
Division of General Psychology: Chair, Membership Committee, 1996; Member-at-Large, 2002-2005.
Associate Editor: Journal of Occupational Health Psychology, 1996–1999; Journal of Experimental Psychology: Applied, 2001–2007. Advisory Editor: Journal of Educational Measurement, 2008– ; Educational Measurement: Issues and Practice, 2012– .
Audit team, Psychometric and measurement graduate program, University of North Carolina at Greensboro, spring 2006.
Author of numerous technical and policy statements approved by the American Psychological Association (e.g., Statement on the disclosure of test data, Statement on the Golden Rule, Resolution on a separate directorate for behavioral sciences at NSF).
Author, test reviews on numerous employment, organizational, and career tests for Buros Mental Measurements Yearbook, 1994, 1996, 1997, 1998, 2000, 2005, 2006, 2008.
Award for Distinguished Service Contributions, Society for Industrial and Organizational Psychology, 2004.
Award for Professional Contributions and Service to Testing, Association of Test Publishers, 2014.
Award (to staff unit) for Dissemination of Educational Measurement Concepts to the Public (Code of Fair Testing Practices in Education), National Council on Measurement in Education, 1989.
Award for New Product Development, Testlet Design for CLEP, Educational Testing Service, 1998.
Board of Advisors, Center for Enrollment Research Policy and Practice, University of Southern California, 2008– .
Council of Chief State School Officers, Technical Issues in Large Scale Assessment (TILSA), 2014.
Common Core State Standards – Assisted in development and policy oversight in joint effort led by CCSSO and National Governors Association (2009-10).
Editorial Board, International Journal of Selection and Assessment (2001 - 06), Military Psychologist (2002 07), Journal of Experimental Psychology: Applied (2001 - 07), Educational Measurement: Issues and Practice (2010 – current), NCME Edited Book Series (2014 – current). Expert in judicial and regulatory proceedings involving cognitive ability testing, accommodations and score comparability in admissions testing, personality testing and disparate impact, job analysis and recruitment practices, affirmative action (Gratz v. Bollinger), and copyright infringement on the Standards for Educational and Psychological Testing. Independent Consultant (selected list), American Council on Education, Goodyear Corporation, American Waterways, Federal Reserve Bank of New York, City University of New York, Maryland State Departments of Education, Army Research Institute, American Institute for Research, US DOE, Tennessee Department of Education, PSI, Wonderlic Inc., employment and labor attorneys and several other organizations in areas of employment testing, educational evaluation, college readiness and standard setting, performance appraisal systems, and survey research. Journal Reviewer: American Psychologist; Educational Measurement: Issues and Practice; Educational Researcher; Personnel Psychology; Psychology, Public Policy and the Law; Journal of Occupational Health Psychology; Journal of Educational Measurement, Applied Educational Measurement, Journal of Educational Measurement, Educational Measurement: Issues and Practice, Journal of Applied Psychology, Human Factors, Military Psychologist, etc. Media experience: Appeared on national and local television and radio (CNN, Good Morning America, BBC, PBS) to discuss the use Civil Rights Act, ADA, personality testing and admissions testing; Frequently quoted in major newspaper stories involving testing, 1992- Present. 
Member of International Standards Organization (ISO) Working Group on International Standards on Psychological Testing (ANSI), 2007-2010.
National Academy of Sciences: Panelist and participant in workshops sponsored by the Board on Testing and Assessment on School-to-Work (1997-98), Collegiate Admissions Testing, and Accommodations and flagging test scores for disabled test takers (1997, 2003).
National Council on Measurement in Education: Chair, Professional Responsibilities Committee, 1996-2000; Chair, Career Award Committees, 2015-2016; Fund Development Committee, 2013-2016.
Office of Educational Research and Improvement: Technical Review Committee for grants associated with the National Assessment of Educational Progress, 1992-1997.
Society for Industrial and Organizational Psychology: Member of Executive Committee, 1988-2003; Chair, External Affairs Committee, 1993-95; Chair, Awards Committee, 1991-93; Chair, Membership Committee, 1988-91; Membership Committee, 1987-88; Program Committee, 1986-87, 1998-99; Fellowship Committee, 2007-10; and Distinguished Service Award Committee, 2011-13. Designed membership survey and developed first SIOP membership directory.
Standards for Educational and Psychological Testing, AERA, APA and NCME: Staff Director (1992-94); Chair, Management Committee (2005-2015).
Standard Setting Approaches and Policy Capturing for College and Career Readiness (consultation to several states) (2010-current):
• NAEP linkages and alignment studies with SAT and Accuplacer (2011-12) and ACT, Explore, Compass (2014-15).
• STAAR end-of-course examinations, Texas Education Agency (2012).
• End-of-course tests, Tennessee Department of Education (2011).
• Achieve Inc. Algebra II examination (2008-10).
• New York State (2012-13, through College Board contract).
• Wyoming Department of Education (2014, ACT contract).
• South Carolina Department of Education (2015, ACT contract).
Technical or Scientific Advisory Committee Member:
• Advisory Panel and Steering Panel, Department of Labor-OERI effort to develop assessments to measure competencies from the Secretary's Commission on Achieving Necessary Skills (SCANS), ACT, Iowa City, IA, 1992-94.
• American Association of Medical Colleges, Blue Ribbon Technical Panel on additional measures for admission to medical colleges, 2012-2015.
• American Diploma Project, Multi-state Algebra assessments, Research Alliance supported by Achieve for 16 states, 2007-2010.
• American Institute of Certified Public Accountants, Psychometric Oversight Committee, 2011-2014.
• Army Research Institute for the Behavioral and Social Sciences, Chair, Scientific Review Panel on Selection and Classification Program, 2003; Panel member of the technical advisory panel on ABLE, 2001-02.
• Congressional Office of Technology Assessment, Technical assistance for study on integrity testing in employment settings, 1989-90, and study on performance assessments in school testing, 1991.
• Delaware State Education Department, Chair, TAC on Race to the Top, 2011-2013.
• Department of Defense, ASVAB testing program, 2000-2008 (chair, 2002-2008).
• International Standards Organization, ISO Standard 10667 (organizational assessment), U.S. team on international development committee, 2009-2012.
• Law School Admissions Council (chair), Technical Audit Team, 2009.
• Metametrics, Technical Advisory Committee, 2013-2016.
• National Assessment of Educational Progress (NAEP): College freshmen technical panel, 2009-2010; Technical Advisory Committee on Standard Setting (Writing), 2010-2014; Advisory panel on survey of higher educational institutions on use of assessments for College Readiness and Placement, 2011-12.
• NCAA Data and Analysis Research Group, 2005-2008.
• Nebraska State Department of Education (TAC), 2008-2013.
• PARCC, Technical Advisory Committee, 2010-2013.
• Pearson Test of English, Scientific Advisory Committee, 2009-2013.
• Pennsylvania State Department of Education (TAC), 2003-current.
• Psychological Services Inc., employment-certification testing, Scientific Advisory Board, 2011-current.
• Texas State Department of Education (TAC), 2011-current.
• Technical Advisor reporting jointly to Texas Education Agency and Texas Higher Education Coordinating Board, 2008.
• U.S. Department of Education, National Advisory Technical Panel on NCLB Reporting, 2008-11.
• U.S. Department of Labor, Technical advisor, National Job Analysis and Skills Assessment, 1993-96.
• U.S. Congress Office of Technology Assessment: Reviewer and panelist, Making the ADA work for people with psychiatric disabilities in the workplace, 1993.
Workshop presenter in areas of testing, employment selection and litigation, testing and public policy, ADA, work-based learning, testing standards, SIOP Principles, diversity in admissions, ethics in assessment, predictive validity, admissions testing, higher educational assessment, and research funding at regional applied I-O meetings and conferences.

ELECTED POSITIONS:
American Educational Research Association: Division D (Measurement and Research Methodology), Vice President and Council Representative, 2012-2015.
American Psychological Association: Council of Representatives, 1997-2003 (SIOP).
Association of Test Publishers: Board of Directors, 2004-2008; Chair, 2007; Treasurer, 2008-2010.
Division of Evaluation, Measurement, and Statistics, American Psychological Association: President, 2000-01; Member-at-Large, 1997-2000.
Division of General Psychology, American Psychological Association: Member-at-Large, 2002-2005.
National Council on Measurement in Education: Board of Directors, 2002-2005, 2009-2012; President, 2010-11.
Society for Industrial and Organizational Psychology: Member of Executive Committee, 1988-2003; Council Representative, 1997-2003; Member-at-Large, 1995-97.

TESTIMONY:
California Legislature, on test validity and consequences of subgroup differences in ability testing, 1997.
Invited testimony before the National Commission on Testing and Public Policy, 1989.
Maine Joint Committee on Education and Cultural Affairs, on the subject of Legislative Document No. 843 - H.P. 1283, January 17, 2006.
Michigan Senate Education Committee, on the replacement of the MEAP and the use of admissions tests for accountability, April 22, 2004.
National Advisory Commission on Work-Based Learning, 1992-93.
National Assessment Governing Board, panel on testing persons with disabling conditions, October 14, 1998.
National Research Council's Committee on National Research Service Awards, May 1993.
Nevada Legislative Hearing on College and Career Readiness, Reno, NV, May 2012.
New York Assembly Committee, Test Disclosure, 1990.
New York Senate Committee, Proposed legislation to regulate admissions testing, 2006.
U.S. Congress, House Education and Labor Subcommittee, Goals 2000, 1993.
U.S. Congress, House Appropriations Subcommittee, Research Funding in Behavioral Sciences, 1992, 1993.
U.S. Congress, Senate and House Committees, Prepared testimony for APA presented on the Civil Rights Act, Polygraph Protection Act, Integrity Testing, America 2000, Americans with Disabilities Act, and Appropriations.
U.S. Department of Education Hearings on Common Assessments for College and Career Readiness, November 2009.

EXTERNAL GRANTS (PROJECT DIRECTOR):
National Assessment Governing Board (NAGB) (2008-10). Co-Project Director. Alignment and linkage of twelfth-grade NAEP and the SAT.
Southern Regional Educational Board (2000-01). Project Manager. Design and development of common Algebra assessment and item bank.
Office of Educational Research and Improvement (1996-98). Project Manager. Research grant to examine the generalizability and utility of local models for scoring performance assessments.
Working collaboratively with six school districts examining different models for local scoring of Pacesetter culminating assessments.
Maryland Department of Education (1996-97). Project Director. Contract to design Maryland's High School Assessment System. Designing requirements and specifications for an end-of-course assessment system for high school graduation and higher education uses. Conducting public engagement with stakeholder groups and advising the state board.
National Institute for Occupational Safety and Health (1992-1994). Project Director. Cooperative agreement to develop a model interdisciplinary program to train doctoral-level psychologists in occupational health psychology and disseminate research on preventive interventions to policymakers, psychologists and researchers.
Department of Labor (July 1992-93). Project Director. Grant to support a review of methodologies and strategies in cognitive psychology and job analysis appropriate for the next revision of the "Dictionary of Occupational Titles."
National Institute on Drug Abuse (February 1992). Co-Project Director. Examination of awareness and knowledge of the mechanisms for receiving outside funding to support research by recent doctoral degree recipients in psychology.
National Science Foundation. Principal Investigator or Co-PI on several contracts related to AP Redesign and Instructional development.

PRODUCT DEVELOPMENT:
Business and Project Plan for Computerized CLEP Examinations. Award for New Product Development, Educational Testing Service, 1998.
Led CB/ETS psychometric/research and redesign teams for the 2005 SAT with writing.
Prototype of non-cognitive assessments for college admissions. Pilot testing in 2007-08 with applicants across 13 colleges.
Psychometric Research and Design of AP Potential Software. Product introduced by College Board in 2001 for expanding access in AP Courses and Examinations based on prior accomplishments and test performance.
Study Skills Inventory for high school and college freshmen. Prototype completed and product in development, 2005-2009.

SELECTED BIBLIOGRAPHY:
Camara, W.J., O'Connor, R., Mattern, K., and Hanson, M.A. (Eds.). (2015). Beyond academics: A holistic framework for enhancing education and workplace success. ACT Research Report 2015 (4). Iowa City, IA: ACT. Retrieved from http://www.act.org/research/researchers/reports/pdf/ACT_RR2015-4.pdf
Mattern, K., Burrus, J., Camara, W.J., O'Connor, R., Hanson, M.A., Gambrell, J., Casillas, A., and Bobek, B. (2014). Broadening the definition of college and career readiness: A holistic approach. ACT Research Report 2014 (6). Iowa City, IA: ACT. Retrieved from http://www.act.org/research/researchers/reports/pdf/ACT_RR2014-5.pdf
[R] Camara, W.J. (2014). Issues facing testing organizations in using the Standards for Educational and Psychological Testing. Educational Measurement: Issues and Practice, 33 (4), 13-15.
[R] Camara, W.J. (2013). Defining and measuring college and career readiness: A validation framework for new State consortium assessments. Educational Measurement: Issues and Practice, 32 (4), 16-27.
[R] Camara, W.J., Packman, S., and Wiley, A. (2013). College, graduate and professional school admissions testing. In K. Geisinger (Ed.), Handbook of Testing and Assessment in Psychology (pp. 297-318). Washington, DC: American Psychological Association.
[R] Camara, W.J. and Shaw, E. (2012). Tests, score reports, research and getting along with the media. Educational Measurement: Issues and Practice.
[R] Harris, W.G., Jones, J.W., Klion, R., Arnold, D.W., Camara, W.J., and Cunningham, M.R. (2012). Test publisher's perspective on "An updated meta-analysis" of integrity testing. Journal of Applied Psychology, 97 (3), 531-536.
[R] Mattern, K., Kobrin, J., and Camara, W.J. (2012). Promoting Rigorous Validation Practice: An Applied Perspective. Measurement: Interdisciplinary Research and Practice, 10, 88-92.
Camara, W.J.
and Quenemoen, R. (2012). Defining and Measuring College and Career Readiness and Informing the Development of Performance Level Descriptors (PLDs). Commissioned white paper for PARCC. Available at http://www.parcconline.org/sites/parcc/files/PARCC%20CCR%20paper%20v14%2018-12.pdf
Wyatt, J., Wiley, A., Camara, W.J., and Proestler, N. (2011). The development of an index of academic rigor for college readiness. College Board Research Report (2011-11). New York, NY: College Board. Available at http://professionals.collegeboard.com/profdownload/pdf/RR2011-11.pdf
Luecht, R. and Camara, W.J. (2011). Evidence and design implications required to support comparability claims. Commissioned white paper for PARCC. Available at http://parcconline.org/sites/parcc/files/PARCC_WhitePaper-RLuechtWCamara%5B5%5D.pdf
Wyatt, J., Kobrin, J., Wiley, A., Camara, W.J., and Proestler, N. (2011). SAT Benchmarks: Development of a college readiness benchmark and its relationship to school performance. College Board Research Report (2011-5). New York, NY: College Board. Available at http://professionals.collegeboard.com/profdownload/pdf/RR2011-5.pdf
Wiley, A., Wyatt, J., and Camara, W.J. (2010). Development of a multidimensional index of college readiness. College Board Research Report (2010-03). New York, NY: College Board. Available at http://professionals.collegeboard.com/profdownload/pdf/10b_3110_CollegeReadiness_RR_WEB_110315.pdf
[R] Packman, S.J., Camara, W.J., and Huff, K. (2010). A snapshot of industry and academic compensation in educational measurement and assessment. Educational Measurement: Issues and Practice, Fall.
[R] Camara, W.J. (2009). Validity Evidence in Accommodations for English Language Learners and Students with Disabilities. Journal of Applied Testing Technology, 10 (2). http://www.testpublishers.org/jattmain.htm
Mattern, K., Kobrin, J., Patterson, B., Shaw, K., and Camara, W.J. (2009).
Validity is in the eye of the beholder: Conveying SAT research findings to the general public. In R. Lissitz (Ed.), The Concept of Validity: Revisions, New Directions and Applications. Charlotte, NC: Information Age Publishing.
Camara, W.J. (2009). College Admission Testing: Myths and Realities in an Age of Admissions Hype. In R. Phelps (Ed.), Correcting fallacies about educational and psychological testing (pp. 147-180). Washington, DC: American Psychological Association.
[R] Camara, W.J. and Lane, S. (2006). A historical perspective and current views on the Standards for Educational and Psychological Testing. Educational Measurement: Issues and Practice, 25, 35-41.
Camara, W.J. (2006). Improving Test Development, Use, and Research: Psychologists in Educational and Psychological Testing Organizations. In R. Sternberg (Ed.), Careers in Psychology. Washington, DC: American Psychological Association.
[R] Phillips, S. & Camara, W.J. (2006). Legal and ethical issues in testing. In R. Brennan (Ed.), Educational Measurement (Volume IV) (pp. 733-755). AERA and American Council on Education.
Camara, W.J. & Kimmel, E. (Eds.) (2005). New tools for admissions to higher education. Mahwah, NJ: Erlbaum.
Camara, W.J. (2005). Broadening criteria of college success and the impact of cognitive predictors in admissions testing (pp. 81-107). In W.J. Camara & E. Kimmel (Eds.), New tools for admissions to higher education. Mahwah, NJ: Erlbaum.
Camara, W.J. (2005). Broadening predictors of college success (pp. 53-80). In W.J. Camara & E. Kimmel (Eds.), New tools for admissions to higher education. Mahwah, NJ: Erlbaum.
Kobrin, J., Camara, W.J., & Milewski, G. (2004). The utility of the SAT I and SAT II for admissions. In R. Zwick (Ed.), Rethinking the SAT: The future of standardized testing in university admissions (pp. 251-276). New York: Routledge Falmer.
Schmidt, A.E., & Camara, W.J. (2004).
Group differences in standardized test scores and other educational indicators. In R. Zwick (Ed.), Rethinking the SAT: The future of standardized testing in university admissions (pp. 189-202). New York: Routledge Falmer.
[R] Cahalan, C., Mandinach, E., & Camara, W.J. (2003). The impact of flagging on the admissions process. Journal of College Admissions, No. 186.
Camara, W.J. (2003). What educators need to know about professional testing standards. In J. Wall & G. Walz (Eds.), Measuring up: Resources on testing for teachers, counselors, and administrators. Greensboro, NC: ERIC/CASS.
Noble, J. and Camara, W. (2003). Issues in college admissions testing. In J. Wall & G. Walz (Eds.), Measuring up: Resources on testing for teachers, counselors, and administrators. Greensboro, NC: ERIC/CASS.
Camara, W.J., Kimmel, E., Scheuneman, J., and Sawtell, E. (2003). Whose grades are inflated? College Board Research Report (2003-4). New York: College Board. Available at http://professionals.collegeboard.com/profdownload/pdf/04843cbreport20034_31757.pdf
Camara, W.J. (2003). Construct validity. In R.F. Ballesteros (Ed.), Encyclopedia of Psychological Assessment (Volume 2) (pp. 1070-1075). London: Sage Publications.
Camara, W.J. (2002). Advances in scoring and inferences concerning examinee behavior in computer-adaptive testing. In Mills, C., Potenza, M., and Ward, W. (Eds.), Computer-Adaptive Testing: Building a foundation for future assessments. Mahwah, NJ: Lawrence Erlbaum Associates, Inc.
Linn, R.L., Drasgow, F., Camara, W., Crocker, L., Hambleton, R.K., Plake, B.S., Stout, W. and van der Linden, W.J. (2002). Examinee behavior and scoring computer-based tests. In C. Mills, M. Potenza, J. Fremer and W. Ward (Eds.), Computer-based testing: Building the foundation for future assessment. Mahwah, NJ: Lawrence Erlbaum Associates, Inc.
Noble, J., Camara, W., and Fremer, J. (2002). Admissions testing and students with disabilities. In Ekstrom, R., and Smith, D. (Eds.),
Assessing individuals with disabilities (pp. 173-190). Washington, DC: American Psychological Association.
Mandinach, E.B., Cahalan, C., and Camara, W.J. (2002). The impact of flagging on the admissions process: Policies, practices and implications. College Board Research Report (No. 02-02). New York: College Board. Available at http://www.ets.org/Media/Research/pdf/RR-05-20.pdf
Cahalan, C., Mandinach, E.B., and Camara, W.J. (2002). Predictive validity of SAT I: Reasoning Test for test takers with learning disabilities and extended time accommodations. ETS Research Report (No. 02-03). Princeton, NJ: ETS. Available at http://www.ets.org/Media/Research/pdf/RR-02-03-Mandinach.pdf
[R] Scheuneman, J.D., Camara, W.J., Cascallar, A.S., Wendler, C., and Lawrence, I. (2002). Calculator access, use and type in relation to performance on the SAT I: Reasoning Test in mathematics. Applied Measurement in Education, 15 (1), 95-112.
Camara, W.J. and Echternacht (2000). The SAT I and high school grades: Utility in predicting success in college. College Board Research Note (RN-10). New York: College Board.
Camara, W. (2000). Using class rank alternative plans for college admissions. Association of American Colleges and Universities: Diversity Digest, Summer, 8-10.
[R] Camara, W.J., Dorans, N., Morgan, R., and Myford, C. (2000). Advanced Placement: Access not quality. Educational Policy Analysis Archives, 8 (40). Online journal, available: http://epaa.asu.edu/epaa/v8n40.html
[R] Camara, W.J. and Merenda, P.M. (2000). Using personality tests in preemployment screening: Issues raised in "Soroka v. Dayton-Hudson Corporation." Psychology, Public Policy and the Law, 6 (4), 1-23.
[R] Camara, W., Puente, A., and Nathan, J. (2000). Psychological test usage: Implications in professional psychology. Professional Psychology: Research and Practice, 31 (2), 141-154.
Camara, W. and Schmidt, A. (1999). Group differences in standardized testing and social stratification. College Board Research Report (No.
99-5). New York: College Board.
[R] Schneider, D.L., Camara, W.J., Tetrick, L., and Sternberg, C. (1999). Training in occupational health psychology: Initial efforts and alternative models. Professional Psychology: Research and Practice, 30 (2), 138-142.
Nathan, J., and Camara, W.J. (1999). Concordance of examinee performance on the SAT and ACT. College Board Research Note (RN99-7). New York: College Board.
Powers, D. and Camara, W. (1999). Coaching and the SAT I. College Board Research Summary (RN99-6). New York: College Board.
Camara, W.J. and Millsap, R. (1998). Using the PSAT/NMSQT and course grades in predicting success in Advanced Placement. College Board Research Report (RR98-5). New York: College Board.
Camara, W.J., Copeland, T., and Rothschild, B. (1998). Effects of extended time on the SAT I: Reasoning Test score growth for students with learning disabilities. College Board Research Report (No. 98-7). New York: College Board.
Nathan, J., and Camara, W.J. (1998). Score change when retaking the SAT I: Reasoning Test. College Board Research Note (RN98-5). New York: College Board.
Camara, W. (1998). High school grading policies. College Board Research Note (RN98-4). New York: College Board.
Smith, R. and Camara, W. (1998). Block schedules and student performance on AP examinations. College Board Research Note (RN98-3). New York: College Board.
Camara, W., Kimmel, E., et al. (1997). Design of a high school assessment system (Vols. I and II). Technical Report of the College Board and ETS. Baltimore, MD: Maryland State Department of Education.
[R] Camara, W.J. (1997). Educational assessment: Responsible uses and professional dilemmas. European Journal of Psychological Assessment, 13 (2), 140-152.
Camara, W.J. and Kraiger, K. (1997). Organisational infrastructure for selection and assessment in the USA. In Smith, M. & Sutherland, V. (Eds.), Professional issues in selection and assessment (pp. 139-146). London: Wiley.
Camara, W. (1997).
Validity, Fairness, and Public Policy of Employment Testing: Influences of the American Psychological Association (pp. 3-11). In Barrett, R. (Ed.), Fair employment strategies.
Camara, W., Kimmel, E., and colleagues (1997). Models for the Design of Maryland's High School Assessments. College Board Technical Report (CBTR97-1).
[R] Camara, W.J. and Schneider, D. (1995). Questions of construct breadth and openness of research on integrity tests. American Psychologist, 47 (3).
[R] Camara, W.J. and Brown, D. (1995). Educational and employment testing: Changing concepts in measurement and policy. Educational Measurement: Issues and Practice, 14 (1), 5-12.
Camara, W.J. (1995). APA involvement in employment testing policy and litigation: An historical overview. Unpublished manuscript.
[R] Camara, W.J. and Schneider, D. (1994). What we know and still don't know about integrity tests. American Psychologist, 47 (3).
Camara, W.J., and Baum, C. (1993). Developing careers in research: Knowledge, attitudes and intentions of recent doctoral recipients in psychology (Final report 92MF04400101D). Rockville, MD: National Institute on Drug Abuse.
Camara, W.J. (1992). Fairness and "fair-use" in employment testing: A matter of perspectives (pp. 215-233). In Geisinger, K. (Ed.), Testing of Hispanics. Washington, DC: APA.
[R] Camara, W.J. (1991). A national exam: Has its time come? Child Behavior & Development, 7 (9-10).
[R] Camara, W.J., et al. (1990). Enhancing psychological science: A report by the Science Advisory Committee. American Psychologist, 45 (7).
[R] Fremer, J., Diamond, E., and Camara, W. (1989). Developing a "Code of Fair Testing in Education." American Psychologist, 44 (7), 1062-1067.
[R] Bond, L., Camara, W.J., and VandenBos, G.R. (1989). Psychological test standards and clinical practice. Hospital and Community Psychiatry, 40 (7), 687-693.
Camara, W.J. (1989). Detecting dishonest employees: What is the state of the art?
Proceedings of the Second Annual National Assessment Conference (pp. 26-28). University of Minnesota and Personnel Decisions Inc., Minneapolis, MN.
Camara, W.J., Kuhn, D., and Ziemak, J. (1987). Development and training of Army civilian first-line supervisors (Final Report FR-87-36). Alexandria, VA: Human Resources Research Organization.
[R] Waters, B.K., Laurence, J.H., & Camara, W.J. (1987). Personnel enlistment and classification procedures in the U.S. Military. Washington, DC: National Academy of Sciences Press.
Camara, W.J., & Laurence, J.H. (1987). Military classification and high aptitude recruits (Technical Report TR-PRD-87-16). Alexandria, VA: Human Resources Research Organization.
Camara, W.J. (1986). The effects of job previews on self-selection decisions. Dissertation Abstracts International, 47, DA8623268.
Camara, W.J. (1984). Assessment centers: A critical review of the literature. Unpublished paper, Urbana-Champaign: University of Illinois.
Camara, W.J. (1983). Personnel selection: A classification and review of techniques. Unpublished Masters thesis, Urbana-Champaign: University of Illinois.
Camara, W.J. (1981). Infusion - inservice: Career awareness. A Massachusetts guide: Promising practices in career education. Boston, MA: Department of Education.

SELECTED PRESENTATIONS:
Camara, W.J. (2015). Employing empirical data in judgmental processes. Paper presented at the National Conference on Student Assessment, San Diego, CA.
Camara, W.J. and Westrick, P. (2015). Admissions testing in the United States. Invited presentation at the Annual Meeting of the American Educational Research Association, Chicago, IL.
Camara, W.J. (2015). "Evidentiary basis related to claims concerning college and career readiness." Colloquium, University of Massachusetts, Amherst, Graduate Program in Education.
Camara, W.J. (2015).
Overview of the 2014 Revision of the 'Standards for Educational and Psychological Testing.' Paper presented at the Association of Test Publishers, Palm Springs, CA.
Camara, W.J. (2014). Test security: Prevention-detection-investigation. Workshop presentation for the Minnesota State Department of Education, Offices of Assessment and Accountability.
Camara, W.J. (2014). How has our approach to test security evolved and where are we headed? Paper presented at the Conference on Test Security, Iowa City, IA.
Camara, W.J. (2014). Developing sources of validation evidence across assessment settings. Invited presentation at the International Test Commission, San Sebastian, Spain.
Camara, W.J. and Shaw, D. (2014). Use of comment codes during performance scoring to provide formative feedback. Paper presented at the National Conference on Student Assessment, New Orleans, LA.
Camara, W.J. (2014). Employing empirical data in judgmental standard setting processes. Paper presented at the Annual Meeting of the Society for Industrial and Organizational Psychology, Honolulu, HI.
Camara, W.J. (2014). Fisher v. University of Texas: The future of affirmative action. Participant in panel at the Annual Meeting of the Society for Industrial and Organizational Psychology, Honolulu, HI.
Camara, W.J. (2014). AERA Vice-Presidential Symposia: Technology Enhanced Items in Large Scale Assessments. Annual Meeting of the American Educational Research Association, Philadelphia, PA.
Camara, W.J. (2013). PISA's use for international benchmarking and comparisons of post-secondary readiness. Invited panel, Oxford University.
Camara, W.J. (2013). College and career readiness: Criterion-related outcomes. Invited address at the Maryland Assessment Research Center for Educational Success, University of Maryland at College Park.
Camara, W.J. (2013). Implications of consortia assessments for Higher Education. Paper presented at the National Conference on Student Assessment, National Harbor, MD.
Camara, W.J. (2013). Innovations in psychometrics and assessment: Developing college and career readiness assessments. Workshop at the National Council on Measurement in Education, San Francisco, CA.
Camara, W.J. (2012). Admissions practices and college going in the U.S. Invited presenter at the International Conference on Assessment and Evaluation, Riyadh, Saudi Arabia. National Center on Assessment in Higher Education.
Reshetar, R., and Camara, W.J. (2012). Redesigning the Advanced Placement Science Assessments: Application of evidence-centered design. Invited panelist at the National Research Council Workshop.
Camara, W.J. (2012). College and career readiness: Establishing validation evidence to support the use of new assessments. Invited lecture at the Pearson Center for Applied Psychometric Research, University of Texas at Austin.
Camara, W.J. (2012). Defining and measuring college and career readiness: Developing performance level descriptors and defining criteria. Paper presented at the Annual Meeting of the National Council on Measurement in Education, Vancouver, Canada.
Camara, W.J. (2012). Invited panel presentation on data integrity and cheating. National Center for Education Statistics Sponsored Symposium on Testing Integrity, Washington, DC.
Camara, W.J. (2011). College and career readiness: An initial validation argument. Paper presented at the National Conference on Student Assessment, Orlando, FL.
Camara, W.J. (2011). Developing and expanding state K-20 longitudinal data systems: Common core state standards and consortia assessments. Paper presented at the National Conference on Student Assessment, Orlando, FL.
Camara, W.J. (2011). The revised testing standards: Potential impact and consequences for assessments in employment and business settings. Invited address at the International Personnel Assessment Council, Washington, DC.
Camara, W.J. (2011).
College and career readiness standards and assessments: An initial validation argument. Paper presented at the CCSSO National Conference on Student Assessment, Orlando, FL.
Camara, W.J. (2011). Empirical benchmarks in a judgmental standard setting process. Paper presented at the Annual Meeting of the National Council on Measurement in Education, New Orleans, LA.
Camara, W.J. (2011). Uncovering Educational Measurement & Assessment Professionals: Demographics, Education, Experience and Engagement. Presidential Address at the Annual Meeting of the National Council on Measurement in Education, New Orleans, LA.
Camara, W.J. (2011). Formative assessment: Implications of the common core on classroom assessment. Invited address at the Annual Meeting of the American Educational Research Association, Classroom Assessment SIG.
Camara, W.J., Wiley, A., Wyatt, J., and Kobrin, J. (2011). College readiness benchmarks. Paper presented at the Annual Meeting of the National Council on Measurement in Education, New Orleans, LA.
Camara, W.J. (2010). Validating claims and evidence related to student college and career readiness: Lessons learned from higher education. Invited presentation at the Annual CCSSO Policy Meeting, Louisville, KY.
Camara, W.J. (2010). Developing benchmarks for college and career readiness. Paper presented at the Annual Meeting of the National Council on Measurement in Education, Denver, CO.
Camara, W.J. (2010). Multidimensional models of college readiness. Paper presented at the Large Scale Assessment Conference, Detroit, MI.
Camara, W.J. (2010). Progress in revising the Standards for Educational and Psychological Testing. Presented at the Annual Conference of the American Psychological Association, San Diego, CA.
Camara, W.J. (2009). Operational Issues in Developing National Admissions Testing and College Credit Testing Programs in the U.S. Invited Colloquium at the University of Aachen, Germany.
Camara, W.J. (2009).
Common Core Standards and Coordinated State Assessment. Invited Symposium at the Annual Meeting of the American Educational Research Association, Denver, CO.
Camara, W.J. (2009). You can get there from here: Innovation in educational assessment and linking accountability tests. Invited address at the National Conference of State Legislatures, Washington, DC.
Camara, W.J. (2009). Noncognitive assessments in college admissions. Paper presented at the Annual Conference of the American Psychological Association, Toronto, Canada.
Camara, W., Kobrin, J., Mattern, K., Patterson, B., and Shaw, E. (2008). The Long and Winding Road: Researching the Validity of the SAT. Invited paper at the 9th annual conference of the Maryland Assessment Research Center for Education Success (MARCES), College Park, MD.
Camara, W.J. (2008). Innovations in assessment. Presenter at the Invitational Conference Educational Testing in America: State Assessment, Achievement Gaps, Federal Policy and Innovations, sponsored by ETS and the College Board, Washington, DC.
Camara, W.J. (2008). College readiness vs. college admissions: Will we ever resolve the chasm between K-12 and Higher Education? Invited address at the Invitational Conference on Defining Enrollment in the 21st Century, sponsored by the University of Southern California's Center for Enrollment Research, Policy and Practice.
Camara, W.J. (2008). Diversity in admissions. Invited address at the ETS Conference of Institutional Researchers, Measuring Success and Making Assessment Data Work at Your Institution, Princeton, NJ.
Camara, W.J. (2008). The educational measurement profession: State of our art. Presentation at the annual meeting of the National Council on Measurement in Education, New York, NY.
Camara, W.J. (2007). Protecting test takers. Invited Presidential Symposium at the Annual Conference of the American Psychological Association, San Francisco, CA.
Camara, W.J. (2007). Revising the standards for educational assessment.
Invited symposium at the Annual Meeting of the American Educational Research Association, Chicago, IL.
Camara, W. J. (2006). Using norm-referenced tests for accountability under NCLB. Presenter at the Annual Meeting of the National Association of Collegiate Admissions Counselors, Pittsburgh, PA.
Camara, W. and Schmidt, A. (2006). University admissions practices in the US and the role of admissions tests. Invited address at the UCAS Conference, Nottingham, United Kingdom.
Camara, W. J. (2005). Constraints in current admissions practices: Impacts on diversity and definition of college success. Invited address at the Goldman-Sachs Foundation and ETS Symposium on Addressing Achievement Gaps, Princeton, NJ.
Camara, W. J. (2005). Update on the new SAT. Annual Meeting of the National Association of Collegiate Admissions Counselors, Tampa, FL.
Camara, W. J. (2005). Design and development of the SAT Writing Test. National Council on Measurement in Education, Montreal, Canada.
Camara, W. J., Kobrin, J., and Sathy, J. (2005). Is there an SES advantage on the SAT and college performance? National Council on Measurement in Education, Montreal, Canada.
Camara, W. (2004). The use of qualitative and quantitative data in admissions. Annual Meeting of the National Association of Collegiate Admissions Counselors, Milwaukee, WI.
Laitusis, V., Camara, W. J., and Wang, B. (2004). An examination of differential item functioning for language minorities on a verbal and math reasoning test. National Council on Measurement in Education, San Diego, CA.
Camara, W. J. (2004). New predictors in college admissions. Annual Meeting of the Society of Industrial and Organizational Psychologists, Chicago, IL.
Camara, W. J. (2003). Current tests and future designs in admissions testing: The new SAT. CASMA-ACT Invitational Conference, Iowa City, IA.
Camara, W. J. (2003). Validity and utility of admissions tests. Invited symposium, American Council on Education, Washington, DC.
Camara, W. J. (2003).
Changes to the SAT. National Association of Collegiate Admissions Counselors.
Camara, W. (2003). Making test results more useful and understandable: Advances in diagnostic score reporting. Paper presented at the Annual Meeting of the National Council on Measurement in Education, Chicago, IL.
Camara, W. (2002). Revision of the Principles for the Validation and Use of Personnel Selection Procedures. Workshop conducted at the Mid-Atlantic Personnel Assessment Consortium, New York, NY.
Camara, W. (2002). Predicting success in employment and education: Uses and limitations of tests and other factors. Invited address, New York Academy of Sciences.
Camara, W. (2002). Prediction and testing. Invited address at the CRESST Conference on Assessment, Accountability and Improvement, Los Angeles, CA.
Camara, W. (2002). Admissions tests: Use and value in higher education. Invited address at the Association of American Universities, Meeting of Presidents and Chancellors, Atlanta, GA.
Camara, W. (2002). The future of admissions testing. Annual Meeting of the National Association of Collegiate Admissions Counselors, Salt Lake City, UT.
Camara, W. (2002). Fairness in employment testing. Paper presented at the Annual Convention of the American Psychological Association, Chicago, IL.
Camara, W. (2002). Testing and admissions in higher education. Invited presentation at the Annual Meeting of the American Association for the Advancement of Science, Boston, MA.
Camara, W. (2001). The utility of the SAT I and SAT II for admission at the University of California and the nation. Paper presented at the Invitational Conference on Rethinking the SAT in University Admissions, University of California at Santa Barbara.
Camara, W. (2001). Test preparation on the SAT: Impact on validity. Paper presented at the Annual Meeting of the National Council on Measurement in Education, Seattle, WA.
Camara, W. (2001). Utility of the SAT in college admissions.
Colloquium at the University of California at Davis.
Camara, W. (2001). Do accommodations improve or hinder psychometric qualities of assessment? Presidential address for Division 5 at the Annual Convention of the American Psychological Association, San Francisco, CA.
Camara, W. (2000). Future of educational assessment. Paper presented at the Annual Convention of the American Psychological Association, Washington, DC.
Camara, W. (2000). Implications of the revised testing standards for personnel selection. Invited address at the Annual Conference of the International Personnel Management Assessment Council, Washington, DC.
Camara, W. (2000). Performance of test takers with LD or ADD on the SAT and subsequent college behavior. Paper presented at the Annual Meeting of the American Educational Research Association, New Orleans, LA.
Camara, W. (2000). Implications of the testing standards in personnel assessment. Presentation at the Annual Meeting of the Society for Industrial and Organizational Psychology, New Orleans, LA.
Camara, W. (1999). The revised Standards for Educational and Psychological Testing. Workshop at the Mid-Atlantic Personnel Assessment Consortium, New York, NY.
Camara, W. (1999). Retesting on the SAT under standard and non-standard administrations. Presentation at the National Council on Measurement in Education, Montreal, Canada.
Camara, W. (1999). Testing practices in clinical assessment. Paper presented at the American Psychological Association, Boston, MA.
Camara, W. (1998). Accommodations for persons with disabilities: Results of attempts to establish comparability in cognitive testing. Continuing education workshop at the Personnel Testing Council of Metropolitan Washington.
Camara, W. (1998). Alternatives to item pattern scoring and use of response-time estimation in computer adaptive testing. Invited presentation at the ETS Invitational Conference on Future Assessments, Philadelphia, PA.
Camara, W. (1998). Future trends in assessment.
Presentation at the Annual Conference of the Society for Industrial and Organizational Psychology, Dallas, TX.
Camara, W. (1998). Selection into I-O programs: Focus on GRE validity. Symposium at the Annual Conference of the Society for Industrial and Organizational Psychology, Dallas, TX.
Camara, W. (1998). Rights and responsibilities of test takers. Presentation at the Annual Meeting of the National Council on Measurement in Education, San Diego, CA.
Scheuneman, J. and Camara, W. (1998). Analysis of mathematics achievement in the Pacesetter program. Presentation at the Annual Meeting of the American Educational Research Association, San Diego, CA.
Camara, W. (1998). Evaluating math curricular reform efforts. Presentation at the Annual Meeting of the American Educational Research Association, San Diego, CA.
Camara, W. (1998). Psychometric and operational constraints remaining in CBT. Colloquium at Fordham University Graduate Departments of Psychology and Education, New York, NY.
Camara, W. (1997). State and district accountability: Uses and misuses of assessments. Presentation at the Large Scale Assessment Conference of the Council of Chief State School Officers, Colorado Springs, CO.
Camara, W. (1997). Effects of calculator use on performance on a mathematics admissions test. Panel discussant at the Annual Meeting of the American Educational Research Association, Chicago, IL.
Camara, W. (1997). Assessing workplace skills: Public policy and technical considerations. Colloquium at Baruch College, City University of New York.
Camara, W. (1996). Effects of extended time on the performance of students with disabilities. Paper presented at the Annual Conference of the National Association of College Admissions Counselors, Minneapolis, MN.
Camara, W. (1996). Flagging test scores for students with disabilities: Understanding and using test scores for admissions and placement decisions. College Board National Forum, New York, NY.
Camara, W. (1996).
Adapting/translating educational and psychological tests: Issues, technical advances, and guidelines. Panel discussion at the Annual Convention of the American Psychological Association, Toronto, Canada.
Camara, W. (1996). Doctoral training in organizations: Comparisons among business schools and psychology departments. Panel discussion at the Annual Meeting of the Society for Industrial and Organizational Psychology, San Diego, CA.
Camara, W. (1996). SCANS-based competencies: Discussion of the national job analysis project. Discussant at the Large Scale Assessment Conference of the Council of Chief State School Officers, Phoenix, AZ.
Camara, W. (1995). Test speededness and other implications of testing persons with disabilities in large-scale programs. Paper presented at the October meeting of the Personnel Testing Council, Washington, DC, and (1996) at the Mid-Atlantic Personnel Assessment Consortium, Potomac, MD.
Camara, W. (1995). Flagging of test scores: Policies, data and opportunities. Panel discussion, ETS Committee for People with Disabilities, Educational Testing Service, Princeton, NJ.
Camara, W. (1995). Lessons for test developers from the NACAC Commission on Standardized Testing. Paper presented at the Annual Conference of the National Association of College Admissions Counselors, Boston, MA.
Camara, W. (1995). Standard setting: A mixed bag of judgment, psychometrics and policy. Paper presented at the Annual Convention of the American Psychological Association, New York, NY.
Camara, W. (1995). Federal funding opportunities in industrial and organizational psychology (moderator and presenter). Annual Conference of the Society for Industrial and Organizational Psychology, Orlando, FL.
Camara, W. (1995). The new SAT: Reactions (chair). Symposium at the Annual Meeting of the National Council on Measurement in Education, San Francisco, CA.
Camara, W. (1994). International perspectives on test use: Options in education and enforcement?
Presentation at the 23rd International Congress of Applied Psychology, Madrid, Spain.
Camara, W. (1994). Developments in creating the new national database of occupational titles (chair and presenter). Society for Industrial and Organizational Psychology, Nashville, TN.
Camara, W. (1994). The impact of national testing standards on personnel assessment. Invited presentation at the International Personnel Management Association Assessment Council Meeting, Charleston, SC.
Camara, W. (1994). Test standards: Balancing technical, applied and policy issues. Invited presentation, Personnel Testing Council of Washington, DC.
Camara, W. (1993). Who should control access to and use of neuropsychological tests? Invited presentation at the National Academy of Neuropsychology, Phoenix, AZ.
Camara, W. (1993). Implications of the Americans with Disabilities Act on assessment. Invited address at the International Personnel and Management Association, Sacramento, CA.
Camara, W. (1993). Ethical issues in research, teaching, and publication for industrial psychologists (chair, panelist). Society for Industrial and Organizational Psychology, San Francisco, CA.
Camara, W. (1993). I/O psychology in the public-policy-making process (panelist). Society for Industrial and Organizational Psychology, San Francisco, CA.
Lipsitt, L. & Camara, W. J. (1993). The childhood origins of creativity. Paper presented at the Nebraska Symposium on Gifted Children, Lawrence, KS.
Camara, W. (1992). 100 years of psychological testing (chair). Centennial Convention of the American Psychological Association, Washington, DC.
Camara, W. (1992). Occupational health psychology: A new specialty for psychology and training needs. Centennial Convention of the American Psychological Association, Washington, DC.
Camara, W. (1992). Correlates between personnel and educational assessment in national policy. Invited address at the Annual Convention of the American Psychological Society, San Diego, CA.
Camara, W. J. (1992).
Affirmative action and the Civil Rights Act of 1991. Invited address at the International Personnel and Management Association, Baltimore, MD.
Camara, W. J. (1992). Americans with Disabilities Act and the Civil Rights Act of 1991: Implications for industrial psychologists. Annual Conference of the Society of Industrial and Organizational Psychology, Montreal, Quebec.
Camara, W. J. (1992). SCANS and America 2000. Annual Conference of the Society of Industrial and Organizational Psychology, Montreal, Quebec.
Camara, W. J. (1991). Federal funding opportunities at ADAMHA, the Department of Energy and the Department of Agriculture (moderator). Thirty-sixth Institute on Federal Funding, National Graduate University, Washington, DC.
Camara, W. J. (1990). Disclosure of test scores, items and protocols in educational settings. Paper presented at the 98th Annual Convention of the American Psychological Association, Boston, MA.
Camara, W. J. (1991). Integrity testing: Risks and rewards. Invited address, Personnel Testing Council of Washington, DC.
Camara, W. J. (1989). Detecting dishonest employees: What is the state of the art? Paper presented at the Annual National Assessment Conference of the University of Minnesota and Personnel Decisions Inc., Minneapolis, MN.
Camara, W. J. (1989). Predicting honesty: Scientific evidence, business necessity, and social policy issues. Paper presented at the 97th Annual Convention of the American Psychological Association, New Orleans, LA.
Camara, W. J. (1989). Legal burden in employment selection: Recent court decisions. Symposium at the 4th Annual Convention of the Society of Industrial/Organizational Psychologists, Boston, MA.
Camara, W. J. and Kuhn, D. (1988). Development of a mixed standard rating scale for training and development. Paper presented in Division 14 at the 96th Annual Convention of the American Psychological Association, Atlanta, GA.
Camara, W. J. (1987).
The utility of a job-person match for personnel selection decisions. Paper presented in Division 14 at the 95th Annual Convention of the American Psychological Association, New York, NY. (ERIC Document Reproduction Service No. ED 289 149)
Ziemak, J. P., Camara, W. J., Fisher, G. P., and Darmsteadt, G. H. (1987). Development of effective army civilian first-line supervisors. Paper presented at the 29th Annual Military Testing Association Conference, Quebec, Canada.
Camara, W. J., Colot, P., Hutchinson, G., & Campbell, B. (1987). The reality of data collection. Paper presented at the 95th Annual Convention of the American Psychological Association, New York, NY. (ERIC Document Reproduction Service No. 290 775)
Camara, W. J. (1986). The utility of biodata in predicting military performance. Paper presented at the 26th Annual Military Testing Association Conference, Mystic, CT.
Camara, W. J. (1986). Effects of job previews on personnel selection. Paper presented at the National Conference of the Association of Human Resources Management and Organizational Behavior, New Orleans, LA.
Camara, W. J. (1986). Equivalence of rater sources on job analysis ratings. Paper presented in Division 14 at the 94th Annual Convention of the American Psychological Association, Washington, DC. (ERIC Document Reproduction Service No. ED 281 455)
Camara, W. J. and Means, B. (1986). Status of low-aptitude accessions following military service. Paper presented at the 94th Annual Convention of the American Psychological Association, Washington, DC.
Additional presentations at national and regional meetings and university colloquia are not normally cited.
9/2015


