Modify CF dropdown list from external DB based on another CF's value

I have 2 ticket CFs (Version_No and Product)

I currently have a Perl script that updates the Version_No CF drop-down list
from an external database, with a manually set WHERE value in the code.
(This all works fine for restricting the drop-down list to only
the version numbers for the manually set product.)

I would now like to modify the code so the drop-down list is
populated with values whose category matches the value set in the
Product CF, similar to using the "Categories are based on" field when you
create your own list.

To do this I have added the following line:

my $dbwherevalue = $self->TicketObj->FirstCustomFieldValue('Product');

but I then get the following error:

[error]: Can't locate object method "TicketObj" via package
"RT::CustomFieldValues::Version_No" at
/opt/rt4/sbin/…/lib/RT/CustomFieldValues/Version_No.pm line 60

Would someone be able to advise me on how to resolve this error?

Below is my code:

-------------------CODE START-----------------------
## Prep

package RT::CustomFieldValues::Version_No;

use strict;
use warnings;
use DBI;
use DBD::mysql;
use base qw(RT::CustomFieldValues::External);

## List description

sub SourceDescription {
    return 'Version Numbers';
}

## Variables

sub ExternalValues {
    my $self = shift;

    # External database details
    my $dbhost   = "[HostName]";
    my $dbtype   = "[DatabaseType]";
    my $dbname   = "[DatabaseName]";
    my $dbtable  = "customfieldvalue";
    my $dbcolumn = "Name";

    my $dbwherecolumn = "Category";
    #my $dbwherevalue = "Testing1234";
    # This is the line that triggers the error above:
    my $dbwherevalue = $self->TicketObj->FirstCustomFieldValue('Product');

    my $dbuser = "[UserName]";
    my $dbpass = "[Password]";

    my $i = 0;
    my @res;
    my $Hostname;

    ## Connect to the database on the remote server
    my $dbh = DBI->connect(
        "DBI:$dbtype:$dbname;host=$dbhost",
        $dbuser, $dbpass,
        { RaiseError => 1 },
    );

    ## Run the search for the values you want in the CF list,
    ## binding the WHERE value as a placeholder so it is quoted safely
    my $req  = "SELECT DISTINCT $dbcolumn FROM $dbtable WHERE $dbwherecolumn = ?";
    my $hreq = $dbh->prepare($req);
    $hreq->execute($dbwherevalue);
    $hreq->bind_columns(\$Hostname);
    while ( $hreq->fetch() ) {
        push @res, {
            name        => $Hostname,
            description => $Hostname,
            sortorder   => $i++,
        };
    }
    return @res;
}

RT::Base->_ImportOverlays();

1;

-------------------CODE END-------------------------

Thank you in advance.

Scott

View this message in context: http://requesttracker.8502.n7.nabble.com/Modify-CF-dropdown-list-from-external-DB-based-off-of-another-CFs-value-tp55012.html

I had a similar need. But instead of connecting to an external database
from within RT, I put together two scripts, run from cron, that check for
content changes and then dump and load my custom fields. In my case I was
pulling time-sheet codes, which are client_project and task_code. Each
client_project has specific task_codes, so task_code is dependent on
client_project.

Mine is probably way more complicated than you need and could be done more
easily in Perl, but I'm a bash man at heart, so here is my method. The two
could be run from one script as long as the server has access to both
databases. Since custom fields are added to transactions as the actual
field values (and not relationally by id), this was the best way for me to
get this done. During the day, if the finance department adds or removes
codes, the RT system is updated within an hour.

Also, in my case, client_project and task_code are each concatenated from
4 fields in my original search (fields 1 and 2 make up client_project and
fields 3 and 4 make up task_code). Hope this helps in some way, and
apologies for the long message…
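To make the field layout concrete, here is a minimal sketch of that concatenation; the sample pipe-delimited line and its values are invented, but the cut/format steps mirror what the scripts below do.

```shell
#!/bin/sh
# Sample pipe-delimited line like the ones "psql -A -t" emits (data invented):
line="AcmeCorp|Website|Design|T100"

# Split the four fields on the pipe delimiter:
CLIENT=$(echo "$line" | cut -d'|' -f1)
PROJECT=$(echo "$line" | cut -d'|' -f2)
TASK=$(echo "$line" | cut -d'|' -f3)
CODE=$(echo "$line" | cut -d'|' -f4)

# Fields 1 and 2 become client_project, fields 3 and 4 become task_code:
echo "${CLIENT}[${PROJECT}]"
echo "${TASK}[${CODE}]"
```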

The first script runs on the external server and creates a load file.
It is run every hour between 8am and 5pm from cron.

#!/bin/bash
NEWFILE=/tmp/codes.txt
OLDFILE=/tmp/codes.last
/bin/mv $NEWFILE $OLDFILE
/usr/bin/psql -A -t -c "select field1,field2 from table where criteria like
'your_criteria'" -U postgres_user databasename >$NEWFILE
if /usr/bin/diff $NEWFILE $OLDFILE >/dev/null ; then
echo "NoChanges" >/tmp/codes.status
else
echo "Changes" >/tmp/codes.status
fi
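For reference, the hourly 8am–5pm schedule described above would look something like this in the external server's crontab (the script path and name are placeholders):

```
# min hour dom mon dow  command — top of each hour, 08:00 through 17:00
0 8-17 * * * /path/to/make_load_file.sh
```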

Then on another server, I look at the codes.status file and check for
changes. If there are any, I pull over the load file, then dump and reload
the customfieldvalues table where the customfield is (in my case) 1 or 2.
This script runs 5 minutes after the other one.

#!/bin/bash
HOME=/path/to/scripts
STATUSFILE=/tmp/codes.status
LOADFILE=/tmp/codes.txt
LASTFILE=/tmp/codes.txt
LOGFILE=$HOME/codes.log
PGSERVER=RT_PGSERVERNAME_REDACTED
PGUSER=RT_USER_REDACTED
PGDB=RT_DBNAME_REDACTED
TODAY=$(date +%Y-%m-%d-%H:%M:%S)
echo "Starting script at $TODAY"

#Start logging
exec > >(tee $LOGFILE)
exec 2>&1

#Fetch status file
scp root@EXT_SERVERNAME_REDACTED:$STATUSFILE /tmp/
STATUS=$(cat $STATUSFILE)
echo $STATUS > $LOGFILE
if [ "$STATUS" == "Changes" ] ; then
echo "Making Changes">> $LOGFILE

#Fetch update file
scp root@EXT_SERVERNAME_REDACTED:$LOADFILE /tmp/

#Clean up previous sql load files and remove the old custom fields
rm -f $HOME/client_project.*
rm -f $HOME/task_code.*
rm -f $HOME/sequence.tmp
mv $HOME/client_project_backup $HOME/client_project_backup-$TODAY
psql -A -t -c "select * from customfieldvalues where customfield='1'" -h
$PGSERVER -U $PGUSER $PGDB >>$HOME/client_project_backup
psql -A -t -c "select * from customfieldvalues where customfield='2'" -h
$PGSERVER -U $PGUSER $PGDB >>$HOME/task_code_backup
psql -A -c "delete from customfieldvalues where customfield='1'" -h
$PGSERVER -U $PGUSER $PGDB
psql -A -c "delete from customfieldvalues where customfield='2'" -h
$PGSERVER -U $PGUSER $PGDB

#Add a placeholder to notify users that update is taking place
psql -A -c "insert into customfieldvalues
(customfield,name,creator,created) values ('1','Tasks are being updated.
Refresh in 2-5 minutes','22',now())" -h $PGSERVER -U $PGUSER $PGDB

#Start numbering
echo "5" >$HOME/sequence.tmp

#Parse through load file and capture variables to populate Client/Project
#field
OIFS=$IFS
IFS=$'\n'
for m in $(cat $LOADFILE)
do
CLIENT=$(echo $m|cut -d"|" -f1)
PROJECT=$(echo $m|cut -d"|" -f2)
CLIENTPROJECT="${CLIENT}[${PROJECT}]"
echo "$CLIENTPROJECT" >>$HOME/client_project.tmp
done

#Get Unique Client Project Codes to load to database
cat $HOME/client_project.tmp |sort -u >> $HOME/client_project.txt
OIFS=$IFS
IFS=$'\n'
for c in $(cat $HOME/client_project.txt)
do
NAME=$(echo $c |cut -d"|" -f1)

#Send load file info to SQL file for troubleshooting, then update the
#database with the new Client Project Values
echo "psql -A -c \"insert into customfieldvalues
(customfield,name,creator,created) values ('1','$c','22',now())\" -h
$PGSERVER -U $PGUSER $PGDB" >>$HOME/client_project.sql
psql -A -c "insert into customfieldvalues
(customfield,name,creator,created) values ('1','$c','22',now())" -h
$PGSERVER -U $PGUSER $PGDB

#Increment sequence file for sorting in the Web GUI
sequence=$(tail -n1 $HOME/sequence.tmp)
SEQUENCE=$(expr $sequence + 5)
CLEANNAME=$(echo $NAME |sed -e 's/\[/\|/g; s/\]//g')

#Using the formatted Client/Project codes, loop through the loadfile and
#capture Task codes for each Client/Project code
OIFS=$IFS
IFS=$'\n'
for task in $(cat $LOADFILE|grep $CLEANNAME)
do
TASK=$(echo $task|cut -d"|" -f3)
CODE=$(echo $task|cut -d"|" -f4)
CLIENT=$(echo $task|cut -d"|" -f1)
PROJECT=$(echo $task|cut -d"|" -f2)
CLIENTPROJECT="${CLIENT}[${PROJECT}]"
TASKCODE="${TASK}[${CODE}]"
echo "psql -A -c \"insert into customfieldvalues
(customfield,name,creator,created,category,sortorder) values
('2','$TASKCODE','22',now(),'$CLIENTPROJECT','$SEQUENCE')\" -h $PGSERVER -U
$PGUSER $PGDB" >>$HOME/task_code.sql
psql -A -c "insert into customfieldvalues
(customfield,name,creator,created,category,sortorder) values
('2','$TASKCODE','22',now(),'$CLIENTPROJECT','$SEQUENCE')" -h $PGSERVER -U
$PGUSER $PGDB
echo $SEQUENCE >$HOME/sequence.tmp
done
done
psql -A -c "delete from customfieldvalues where name='Tasks are being
updated. Refresh in 2-5 minutes'" -h $PGSERVER -U $PGUSER $PGDB
echo "Complete" >>$LOGFILE
else
echo "No Changes">>$LOGFILE
fi